This application claims the benefit of priority to Chinese Patent Application No. 202010598429.0, filed on Jun. 28, 2020 before the China National Intellectual Property Administration, the entire disclosure of which is incorporated herein by reference in its entirety.
The present application relates to the technical field of recognizing motion patterns of a limb, for example, recognizing motion patterns of the lower limbs and of prostheses, orthoses, or exoskeletons of a human body.
With the progress of science and technology and the improvement of living standards, the research and development of rehabilitation medical equipment has attracted growing attention from society and governments. Recently, there has been a significant increase in the demand for human power-assist devices and medical rehabilitation training equipment among stroke hemiplegic patients, persons with impaired lower limb motion function, and other disabled persons. Lower limb rehabilitation training equipment may help stroke hemiplegic patients or motion-function-impaired patients to regain walking ability and thus improve their quality of life. It may also help restore the motion function of injured muscles or joints and reduce or eliminate permanent physical impairment. In addition, some researchers are working on the development of various intelligent power-assist devices for soldiers or heavy-load carriers, hoping to greatly improve the weight-bearing capacity of the users while reducing their walking or working burden.
It has been noted that in different motion patterns, such as walking upslope, downslope, upstairs, or downstairs, the function performed by each joint of the human lower limb and the corresponding biomechanical characteristics vary considerably. Therefore, in order to perform the desired function more accurately, a lower limb auxiliary device must first be able to accurately recognize the motion pattern of the user (wearer), and then control a driver to generate a preset auxiliary torque for the corresponding motion pattern, thereby assisting the wearer in performing the desired action more easily.
In order to realize the motion pattern recognition functions of the human lower limb, the lower limb orthopedic device, and the exoskeleton as described above, a variety of implementation methods have been proposed. Some researchers have proposed to detect, in real time, the motion pattern of a wearer of a lower limb auxiliary device by extracting and analyzing the electromyographic (EMG) or electroencephalographic (EEG) signals of the wearer. However, the recognition accuracy of such methods is greatly reduced because muscles are prone to fatigue and the body sweats during long-term exercise. Furthermore, EEG signals have many dimensions and their computation load is heavy, and thus it is difficult at present to realize real-time pattern recognition on mobile devices. In addition, it has been proposed to analyze the motion pattern of the wearer based on the pressure signal of a foot of the wearer.
However, it should be noted that when the ground surface is uneven or the walking speed of the wearer changes, the performance of such a pattern recognition method is greatly degraded, and it is therefore difficult to apply widely in real scenarios. In another prior approach, it has been proposed to use dynamic information obtained by an inertial measurement unit fixed to the tendon side or embedded in the prosthesis for motion intention recognition of the prosthesis. However, considering that the dynamic information in the sensor reference coordinate system, obtained during the motion, is related to the motion speed of the wearer, it would be difficult to popularize this method in practical applications.
The present application proposes methods for recognizing the motion pattern of limbs and of prostheses, orthoses, or exoskeletons of the human body.
In one aspect of the present application, there is provided a method for recognizing a motion pattern of a limb. The method may comprise: collecting, by a sensor, motion data of a limb extremity end of a subject during a swing stage of the extremity end in different motion modes; training a classifier or a pattern recognizer by inputting the collected motion data and the corresponding limb motion patterns into the classifier or the pattern recognizer; and recognizing the motion pattern of the limb by inputting motion data of the limb, obtained in real time by the sensor, into the trained classifier or the trained pattern recognizer.
According to exemplary embodiments of the present application, the limb may, for example, comprise a lower limb, a lower limb prosthesis, a lower limb orthosis, a lower limb exoskeleton of a human body, or the like, and the motion patterns may, for example, comprise upslope, downslope, upstairs, downstairs, walking on flat ground, turning, and the like.
According to an exemplary embodiment of the present application, the motion data may comprise one or more of an absolute motion trajectory to ground, an absolute velocity to ground, and an absolute acceleration to ground of the limb extremity end during the swing stage in the different motion modes.
According to exemplary embodiments of the present application, the sensor may comprise an inertial measurement unit fixed to the limb extremity end. The method may further comprise: obtaining one or more of the absolute motion trajectory to ground, the absolute velocity to ground, and the absolute acceleration to ground through a coordinate transformation and an integration (e.g., first order integration or second order integration) of angular velocity and acceleration data of the inertial measurement unit, which are obtained in a sensor coordinate system.
According to exemplary embodiments of the present application, the method may further comprise resetting, when the subject is in a standing stage, a transformation matrix for the coordinate transformation, the absolute velocity to ground, and an absolute motion displacement to ground, to eliminate or reduce a cumulative drift or cumulative error of the inertial measurement unit.
According to exemplary embodiments of the present application, the method may further comprise detecting the standing stage of the subject by the inertial measurement unit fixed at the limb extremity end or by a load cell mounted on a foot of the subject.
According to exemplary embodiments of the present application, collecting the motion data may comprise extracting the absolute motion trajectory to ground of the limb extremity end in a sagittal plane, or deriving terrain slopes corresponding to the different motion patterns from the absolute motion trajectory to ground in the sagittal plane, to recognize the motion pattern being performed.
According to exemplary embodiments of the present application, the method may further comprise triggering, based on a trigger boundary condition, the trained classifier or the trained pattern recognizer to recognize the motion pattern performed by the subject before a foot of the subject touches the ground. The motion pattern of the subject can be recognized in response to the trigger boundary condition being satisfied.
According to exemplary embodiments of the present application, the trigger boundary condition may for example comprise an elliptical boundary condition, a circular boundary condition, or a rectangular boundary condition. The motion pattern of the subject can be recognized when the absolute motion trajectory to ground of the limb extremity end passes through the trigger boundary condition.
According to exemplary embodiments of the present application, the trigger boundary condition may for example further comprise one or more of a time threshold trigger, an absolute displacement to ground trigger in a forward direction or a direction vertical to ground, an absolute velocity to ground trigger, or an absolute acceleration to ground trigger.
According to exemplary embodiments of the present application, the trigger boundary condition may comprise a condition that one or more of the angular velocity or acceleration signals of the inertial measurement unit in the sensor coordinate system satisfies a preset trigger condition.
According to exemplary embodiments of the present application, the method may further comprise detecting, based on a time window, the motion pattern of the subject in real time to recognize the motion pattern performed by the subject before a foot of the subject touches the ground. The motion pattern of the subject can be recognized in response to one or more of the absolute velocity to ground, the absolute acceleration to ground, or the absolute motion trajectory to ground matching, within the time window, corresponding data of a particular motion pattern.
According to exemplary embodiments of the present application, collecting the motion data of the limb extremity end of the subject during the swing stage in different motion modes may comprise calculating a rotation angle or angular velocity of the limb extremity end relative to an initial sagittal plane or an initial coronal plane of the subject to recognize a turning activity of the subject.
According to exemplary embodiments of the present application, the method may further comprise obtaining the rotation angle or angular velocity of the limb extremity end relative to the initial sagittal plane or the initial coronal plane of the subject by converting output data of the inertial measurement unit fixed to the limb extremity end, or recognizing the turning activity of the subject by detecting the rotation angle or angular velocity of other parts of the body of the subject (e.g., the head, upper torso, arms, thighs, lower legs, or feet) relative to the initial sagittal plane or the initial coronal plane of the subject.
According to exemplary embodiments of the present application, the classifier or the pattern recognizer for motion pattern recognition of the limb may comprise, for example, a linear discriminant analyzer, a quadratic discriminant analyzer, a support vector machine, a neural network, or the like, but the present application is not limited thereto.
According to exemplary embodiments of the present application, the sensor may further comprise a laser displacement sensor combined with an inertial measurement unit, mounted on a lower leg, thigh, the waist, the head, or another portion of the subject. The combined sensor may be configured to measure one or more of the absolute motion trajectory to ground, the absolute velocity to ground, or the absolute acceleration to ground, or to directly measure topographic characteristics in the different motion patterns.
According to exemplary embodiments of the present application, the sensor may further comprise a depth camera combined with an inertial measurement unit, mounted on a lower leg, thigh, the waist, the head, or another portion of the subject. The combined sensor may be configured to measure one or more of the absolute motion trajectory to ground, the absolute velocity to ground, or the absolute acceleration to ground, or to directly measure topographic characteristics in the different motion patterns.
According to exemplary embodiments of the present application, the sensor may further comprise an infrared capture system mounted in the ambient environment of the subject, with an infrared capture marker point mounted at the limb extremity end of the subject. The method may further comprise analyzing one or more of the absolute motion trajectory to ground, the absolute velocity to ground, and the absolute acceleration to ground of the infrared capture marker point to recognize the motion pattern of the subject.
According to exemplary embodiments of the present application, the method may further comprise recognizing the different motion patterns by combining one or more of the absolute motion trajectory to ground, the absolute velocity to ground, and the absolute acceleration to ground with a foot pressure distribution of the subject, a rotation angle of a lower limb knee joint or ankle joint, an electromyographic signal or an electroencephalographic signal (EEG) of the subject.
According to exemplary embodiments of the present application, the method may further comprise recognizing the different motion patterns by combining one or more of the absolute motion trajectory to ground, the absolute velocity to ground, and the absolute acceleration to ground with an angular velocity or an acceleration in a sensor coordinate system measured by an inertial measurement unit fixed at the limb extremity end.
In another aspect of the present application, there is provided a non-transitory machine-readable medium having instructions stored therein, which when executed by a processor, cause the processor to perform operations of collecting, by a sensor, motion data of a limb extremity end of a subject during a swing stage of the extremity end in different motion modes; inputting the collected motion data and corresponding limb motion patterns into a classifier or a pattern recognizer to train the classifier or the pattern recognizer; and inputting the motion data of the limb, which is obtained in real time by the sensor, into the trained classifier or the trained pattern recognizer to perform motion pattern recognition of the limb.
In another aspect of the present application, there is provided a data processing system comprising a processor and a memory, wherein the memory is coupled to the processor to store instructions which, when executed by the processor, cause the processor to perform operations of collecting, by a sensor, motion data of a limb extremity end of a subject during a swing stage in different motion modes; inputting the collected motion data and corresponding limb motion patterns into a classifier or a pattern recognizer to train the classifier or the pattern recognizer; and inputting the motion data of the limb, which is obtained in real time by the sensor, into the trained classifier or the trained pattern recognizer to perform motion pattern recognition of the limb.
Other features and aspects of the present application will be apparent from the following detailed description, the drawings, and the claims.
The principles of the inventive concept are illustrated below by describing non-limiting embodiments of the present disclosure in conjunction with the accompanying drawings. It should be understood that the drawings are intended to illustrate, rather than limit the exemplary embodiments of the present disclosure. The accompanying drawings are included to provide a further understanding of the general concept of the present disclosure, and are incorporated in the specification to constitute a part thereof. The same reference numerals in the drawings denote the same features. In the accompanying drawings:
For a better understanding of the present disclosure, various aspects of the present disclosure will be described in more detail with reference to the exemplary embodiments illustrated in the accompanying drawings. It should be understood that the detailed description is merely an illustration of the exemplary embodiments of the present disclosure rather than a limitation to the scope of the present disclosure in any way. Throughout the specification, like reference numerals refer to like elements. The expression “and/or” includes any and all combinations of one or more of the associated listed items.
In the accompanying drawings, the thicknesses, sizes and shapes of the components have been slightly exaggerated for the convenience of explanation. The accompanying drawings are merely illustrative and not strictly drawn to scale.
It should be understood that the terms “comprising”, “including”, “having” and variants thereof, when used in the specification, specify the presence of stated features, elements, components and/or steps, but do not exclude the presence or addition of one or more other features, elements, components, steps and/or combinations thereof. In addition, expressions, such as “at least one of”, when preceding a list of listed features, modify the entire list of features rather than an individual element in the list. Further, the use of “may”, when describing the embodiments of the present disclosure, relates to “one or more embodiments of the present disclosure”. Also, the term “exemplary” is intended to refer to an example or illustration of the embodiment.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by those skilled in the art to which the present disclosure belongs. It should be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless explicitly so defined herein.
The various aspects of the present disclosure are described in more detail below with reference to the accompanying drawings and in conjunction with specific embodiments, but the embodiments of the present disclosure are not limited thereto.
As shown in
The present invention proposes to derive the ground slope of the corresponding terrain based on the absolute motion trajectory to ground of the lower limb extremity end of the human body, thereby distinguishing or predicting the motion pattern being performed.
The method according to an embodiment of the present application is a pattern recognition or classification method based on parameter training.
During the data collection process, a certain number of subjects are required to repeat several common motion patterns in daily life according to an experimental protocol, for example, as shown in
In this step, one or more of the absolute motion trajectory to ground, the absolute velocity to ground, or the absolute acceleration to ground of the lower limb extremity end of human body in various daily motion patterns can be measured directly or indirectly by means of a sensor installed on the human body or in the environment surrounding the human body, and then the measured data is input to a pattern recognizer or a classifier so as to realize the detection of the motion pattern (e.g., upslope, downslope, upstairs, downstairs, and flat ground walking) of human body.
In an exemplary embodiment, the above-described sensor 2 for measuring the absolute motion trajectory to ground of the lower limb extremity end of the human body may be, for example, an inertial measurement unit mounted at the lower limb extremity end, such as at any position of the ground-proximal end of the lower leg, the heel, the toe, or the foot, or mounted at a corresponding position of a lower limb, prosthesis, orthosis, or exoskeleton. The inertial measurement unit can obtain the angular velocity and the acceleration in the sensor coordinate system, and the conversion matrices, the attitude angle of the sensor, the absolute motion trajectory to ground of the lower limb extremity end, the absolute velocity to ground, and the absolute acceleration to ground can be obtained by performing a coordinate transformation, a first-order integration, and a second-order integration.
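As a hedged illustration of this coordinate transformation and integration, the following Python sketch (not part of the original disclosure; the sampling interval, gravity handling, and first-order attitude update are assumptions) converts sensor-frame angular velocity and acceleration into an absolute velocity and trajectory to ground.

```python
# A minimal strapdown-integration sketch: sensor-frame gyro and accelerometer
# samples are converted to the ground (world) frame and integrated to obtain
# the absolute velocity to ground and the absolute motion trajectory to ground.
import numpy as np

GRAVITY = np.array([0.0, 0.0, 9.81])  # gravity in the ground frame (m/s^2)

def skew(w):
    """Skew-symmetric matrix of a 3-vector, used for the rotation update."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def integrate_imu(gyro, accel, dt, R0=np.eye(3)):
    """gyro, accel: (N x 3) sensor-frame samples; dt: sample interval (s)."""
    R = R0.copy()                 # sensor-to-ground conversion matrix
    v = np.zeros(3)               # absolute velocity to ground
    p = np.zeros(3)               # absolute displacement to ground
    velocities, trajectory = [], []
    for k in range(len(gyro)):
        # First-order attitude update from the angular velocity.
        R = R @ (np.eye(3) + skew(gyro[k]) * dt)
        # Transform the measured specific force to the ground frame, remove gravity.
        a_ground = R @ accel[k] - GRAVITY
        # First- and second-order integration give velocity and trajectory.
        v = v + a_ground * dt
        p = p + v * dt
        velocities.append(v.copy())
        trajectory.append(p.copy())
    return np.array(velocities), np.array(trajectory)
```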
In another exemplary embodiment, the above-described sensor 2 for measuring the absolute motion trajectory to ground of the lower limb extremity end of the human body may be, for example, a laser displacement sensor combined with an inertial measurement unit, mounted at the lower limb extremity end or at a corresponding position on a prosthesis, an orthosis, or an exoskeleton for the lower limb. The combined sensor may also be mounted at other portions of the human body, such as the head, waist, thighs, or lower legs. In addition, it may be used to directly measure terrain features, thereby recognizing the motion pattern performed by the human body.
In yet another exemplary embodiment, the above-described sensor for measuring the absolute motion trajectory to ground of the lower limb extremity end of the human body may be a depth camera mounted at the lower limb extremity end or at a corresponding position on a prosthesis, an orthosis, or an exoskeleton for the lower limb. The depth camera may also be mounted at other portions of the human body, such as the head, waist, thighs, or lower legs. In addition, the depth camera may be used to directly measure terrain features, thereby recognizing the motion pattern performed by the human body.
It should be noted that when the sensor 2 is an inertial measurement unit-combined laser displacement sensor or an inertial measurement unit-combined depth camera, the sensor 2 may, for example, be mounted on other body parts such as a lower leg, a thigh, a waist or a head for measuring ground features, thereby recognizing or classifying terrain types and recognizing the motion pattern of the lower limb of human body.
In addition, one or more of the measured absolute motion trajectory to ground, the absolute velocity to ground, and the absolute acceleration to ground of the lower limb extremity end may be combined with one or more of the acceleration and the angular velocity in the sensor coordinate system obtained by the inertial measurement unit mounted at the lower limb extremity end to recognize the motion pattern for human body.
In addition, although not shown in the drawings, one or more of the measured absolute motion trajectory to ground, the absolute velocity to ground, and the absolute acceleration to ground of the lower limb extremity end may be combined with signals such as the foot pressure distribution signal, the EMG signal, the EEG signal, the rotation angle of each joint of the lower limb, or the like, to improve the recognition accuracy of the existing motion pattern recognizer.
In yet another exemplary embodiment, the above-described sensor for measuring the absolute motion trajectory to ground of the lower limb extremity end of the human body may be a motion capture system installed in the environment surrounding the human body. In this case, it is necessary to dispose capture marker points at the lower limb extremity end. In addition, capture marker points may also be disposed at other parts of the lower limb, such as a knee joint, an ankle joint, a lower leg, a thigh, or the like. The motion capture system may obtain the absolute motion trajectory to ground, the absolute velocity to ground, and the absolute acceleration to ground of the capture marker points for recognizing the motion pattern of the human body.
Step S104: Training the Pattern Recognizer or Classifier
Referring again to
In an exemplary embodiment, the collected motion data of the extremity end of the subject in the swing stage of different motion patterns can be classified or recognized using common pattern recognition methods. For example, the motion data may be classified or recognized by a linear discriminant analyzer, a quadratic discriminant analyzer, a support vector machine, or a neural network. The motion data may include, for example, the absolute motion trajectory to ground, the absolute velocity to ground, the absolute acceleration to ground, and the like. It is also possible to process the above motion data (e.g., the absolute motion trajectory to ground) to obtain a corresponding ground slope, thereby recognizing the motion pattern performed by the lower limb.
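As one possible, non-prescriptive realization of such a classifier, the swing-stage sagittal-plane trajectories could be resampled into fixed-length feature vectors and fed to an off-the-shelf linear discriminant analyzer; the feature construction and the use of scikit-learn below are illustrative assumptions, not the method prescribed by the present application.

```python
# A hedged sketch of training a pattern recognizer on swing-stage motion data.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def trajectory_features(traj_xy, n_points=20):
    """Resample a swing-stage sagittal-plane trajectory (N x 2) to a fixed-length vector."""
    idx = np.linspace(0, len(traj_xy) - 1, n_points).astype(int)
    return traj_xy[idx].ravel()

def train_recognizer(trajectories, labels):
    """trajectories: list of (N_i x 2) arrays; labels: e.g. 'upslope', 'downstairs'."""
    X = np.array([trajectory_features(t) for t in trajectories])
    clf = LinearDiscriminantAnalysis()
    clf.fit(X, labels)
    return clf

# In real-time recognition, the features of the current swing trajectory are
# fed to the trained classifier once the trigger condition is satisfied:
# pattern = clf.predict([trajectory_features(current_swing)])[0]
```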
Referring again to
In order to eliminate the drift or the accumulated error that may occur in the subsequent data processing of the output signal of the inertial measurement unit, it is necessary to correct and reset the conversion matrix, the absolute velocity to ground, the absolute motion trajectory to ground, and the like during the standing stage of the lower limb. The standing stage may be detected from the output signal of the inertial measurement unit, or by a pressure sensor on the foot or an axial force sensor of the orthosis, prosthesis, or exoskeleton.
For example, the standing stage may be detected when |αf − αg| ≤ ξf holds for a period of time, where αf represents the magnitude of the acceleration measured by the inertial measurement unit, αg represents the acceleration of gravity, and ξf is a predetermined threshold value.
When the absolute value of the measured acceleration signal of the inertial measurement unit stays close to the gravitational acceleration for a period of time, the lower limb is considered to be in the standing stage. To eliminate the accumulated error of the inertial measurement unit, the conversion matrix of the sensor, the absolute displacement to ground, and the absolute velocity to ground are then reset. When the absolute value of the measured acceleration signal is greater than the gravitational acceleration, the lower limb is in the swing stage, and the absolute motion displacement to ground and the absolute velocity to ground of the lower limb extremity end, as well as the conversion matrix, are updated.
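A minimal sketch of this standing-stage detection and reset (a zero-velocity-update style correction) is shown below; the magnitude test against gravity and the reset targets follow the description above, while the numeric threshold and dwell length are placeholder assumptions.

```python
# Standing-stage detection and reset of the integration state.
import numpy as np

GRAVITY = 9.81            # m/s^2
ACCEL_THRESHOLD = 0.2     # assumed tolerance ξf around gravity (m/s^2)
MIN_STANCE_SAMPLES = 20   # assumed number of consecutive samples counted as standing

def is_standing(accel_window):
    """accel_window: recent sensor-frame acceleration samples (M x 3)."""
    norms = np.linalg.norm(accel_window, axis=1)
    return (len(accel_window) >= MIN_STANCE_SAMPLES
            and np.all(np.abs(norms - GRAVITY) < ACCEL_THRESHOLD))

def reset_state(attitude_reference=np.eye(3)):
    """Reset the conversion matrix, the absolute velocity to ground and the
    absolute displacement to ground, so cumulative drift does not propagate
    into the next swing stage."""
    R = attitude_reference.copy()
    v = np.zeros(3)
    p = np.zeros(3)
    return R, v, p
```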
As shown in
In order to recognize the motion pattern of the lower limb, prosthesis, orthosis, or exoskeleton before the foot next contacts the ground, so that the lower limb, prosthesis, orthosis, or exoskeleton can complete the required preparation during the swing stage, a predetermined triggering condition may be used to trigger the pattern recognition decision of the classifier or the pattern recognizer. For example, during walking downstairs, the human ankle needs to extend during the swing stage and then bend at ground contact to cushion the collision and reduce the impact force; a triggering boundary condition for pattern recognition may therefore be employed so that this preparation is completed in time. In addition, a data window may be used to match the input data with the corresponding data of a specific pattern in real time, to realize real-time detection of the motion pattern of the lower limb.
A·xg² + B·yg² = 1  (2)
Where A and B are constants, and xg and yg are coordinates of the obtained absolute motion trajectory in the x-axis direction and the y-axis direction.
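The elliptical trigger of equation (2) can be checked, for example, as in the sketch below; the constants A and B (and thus the implied semi-axes) are placeholder assumptions rather than trained values from the disclosure.

```python
# Trigger pattern recognition once the absolute motion trajectory to ground of
# the limb extremity end crosses the ellipse A*xg^2 + B*yg^2 = 1.
A = 1.0 / 0.09   # assumed: semi-axis of 0.3 m in the forward (x) direction
B = 1.0 / 0.01   # assumed: semi-axis of 0.1 m in the vertical (y) direction

def trigger_fired(xg, yg, a=A, b=B):
    """(xg, yg): trajectory point relative to the position at toe-off."""
    return a * xg**2 + b * yg**2 >= 1.0
```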
As shown in
In addition to the above elliptical boundary conditions, the boundary conditions according to exemplary embodiments of the present application may further include:
(1) Boundary conditions such as circles, rectangles, and the like, i.e., pattern recognition is triggered when the above absolute motion trajectory to ground passes through boundary conditions such as circles, rectangles, or the like;
(2) A time threshold value that triggers pattern recognition at a predetermined point in time;
(3) A displacement threshold value of the limb extremity end in a forward direction or a direction vertical to the ground;
(4) An acceleration threshold or angular velocity threshold in the sensor coordinate system.
In addition to the pattern recognition triggering conditions described above, a data window may be used to monitor the motion pattern in real time to ensure that the motion pattern is predicted before the foot next contacts the ground.
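A hedged sketch of such real-time window matching is given below, assuming equal-length stored templates for each motion pattern and a mean Euclidean distance metric; both the metric and the threshold are illustrative assumptions.

```python
# Compare the most recent window of swing-stage data against stored templates
# of each motion pattern and report the best match once it is close enough.
import numpy as np

def match_window(window, templates, max_distance=0.05):
    """window: (W x D) array of recent motion data; templates: dict pattern -> (W x D)."""
    best_pattern, best_dist = None, np.inf
    for pattern, template in templates.items():
        dist = np.mean(np.linalg.norm(window - template, axis=1))
        if dist < best_dist:
            best_pattern, best_dist = pattern, dist
    return best_pattern if best_dist <= max_distance else None
```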
Where, ks(t) is the slope of ground obtained at time t; ksa is a threshold value for distinguishing the upstairs motion pattern from the upslope motion pattern; kus is a threshold value for distinguishing the upslope motion pattern from the flat ground walking motion pattern; kds is a threshold value for distinguishing the flat ground walking motion pattern from the downslope motion pattern; and ksd is a threshold value for distinguishing the downslope motion pattern from the downstairs motion pattern.
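Assuming the thresholds are ordered ksa > kus > kds > ksd, which follows from the definitions above, a possible slope-based decision rule is sketched below; the rule structure is implied by the text, not an exact reproduction of it.

```python
# Select the motion pattern from the ground slope ks(t) derived from the
# absolute motion trajectory to ground of the lower limb extremity end.
def classify_by_slope(ks, ksa, kus, kds, ksd):
    if ks > ksa:
        return "upstairs"
    if ks > kus:
        return "upslope"
    if ks >= kds:
        return "flat ground walking"
    if ks >= ksd:
        return "downslope"
    return "downstairs"
```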
It should be noted that, in order to realize the motion pattern recognition, common classification methods, for example, any one of a linear discriminant analyzer, a quadratic discriminant analyzer, a support vector machine, or a neural network, may also be used to process one or more of the above obtained conversion matrix, the absolute displacement to ground, the absolute velocity to ground, and the absolute acceleration to ground.
In an exemplary embodiment, in order to detect turning activity of human body, the rotation angle or angular velocity of the human head, upper torso, arm, thigh, lower leg, foot or other parts of the body relative to the initial sagittal or coronal plane of the human body during turning may also be measured. The rotation angle or angular velocity can be measured by using the inertial measurement unit installed at a corresponding part of the human body, or the conversion matrix can be obtained by detecting the angular velocity and the acceleration in the sensor coordinate system, thereby obtaining the rotation angle or angular velocity.
Thresholds αR and αL may be determined by training the pattern recognizer or the classifier, and the turning motion of the human body may be recognized using the trained pattern recognizer or classifier.
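As an illustrative sketch, the turning activity might be detected from the rotation angle relative to the initial sagittal plane as follows; the threshold values and sign convention are placeholder assumptions, whereas in the method the thresholds αR and αL would come from training the recognizer.

```python
# Detect turning from the yaw rotation of the limb extremity end relative to
# the initial sagittal plane of the human body.
ALPHA_R = 0.6    # assumed right-turn threshold (rad), positive yaw = right
ALPHA_L = -0.6   # assumed left-turn threshold (rad), negative yaw = left

def detect_turning(yaw_angle, alpha_r=ALPHA_R, alpha_l=ALPHA_L):
    """yaw_angle: rotation (rad) of the extremity end relative to the initial sagittal plane."""
    if yaw_angle >= alpha_r:
        return "right turn"
    if yaw_angle <= alpha_l:
        return "left turn"
    return "no turn"
```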
Further, although not shown, in the exemplary embodiment, the turning motion of the human body may also be recognized based on the angular velocity of the lower limb extremity end of human body relative to the initial sagittal plane or the initial coronal plane of the human body obtained by detection.
In one or more embodiments, aspects of the present patent document may be directed to, may include, or may be implemented on one or more information handling systems (or computing systems). An information handling system/computing system may include any instrumentality or aggregate of instrumentalities operable to compute, calculate, determine, classify, process, transmit, receive, retrieve, originate, route, switch, store, display, communicate, manifest, detect, record, reproduce, handle, or utilize any form of information, intelligence, or data. For example, a computing system may be or may include a personal computer (e.g., laptop), tablet computer, mobile device (e.g., personal digital assistant (PDA), smart phone, phablet, tablet, etc.), smart watch, server (e.g., blade server or rack server), a network storage device, camera, or any other suitable device and may vary in size, shape, performance, functionality, and price. The computing system may include random access memory (RAM), one or more processing resources such as a central processing unit (CPU) or hardware or software control logic, read only memory (ROM), and/or other types of memory. Additional components of the computing system may include one or more disk drives, one or more network ports for communicating with external devices as well as various input and output (I/O) devices, such as a keyboard, mouse, stylus, touchscreen and/or video display. The computing system may also include one or more buses operable to transmit communications between the various hardware components.
It will be understood that the functionalities shown for system 600 may operate to support various embodiments of a computing system—although it shall be understood that a computing system may be differently configured and include different components, including having fewer or more components than depicted in
As illustrated in
A number of controllers and peripheral devices may also be provided, as shown in
In the illustrated system, all major system components may connect to a bus 616, which may represent more than one physical bus. However, various system components may or may not be in physical proximity to one another. For example, input data and/or output data may be remotely transmitted from one physical location to another. In addition, programs that implement various aspects of the disclosure may be accessed from a remote location (e.g., a server) over a network. Such data and/or programs may be conveyed through any of a variety of machine-readable medium including, for example: magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and holographic devices; magneto-optical media; and hardware devices that are specially configured to store or to store and execute program code, such as application specific integrated circuits (ASICs), programmable logic devices (PLDs), flash memory devices, other non-volatile memory (NVM) devices (such as 3D XPoint-based devices), and ROM and RAM devices.
Aspects of the present disclosure may be encoded upon one or more non-transitory computer-readable media with instructions for one or more processors or processing units to cause steps to be performed. It shall be noted that the one or more non-transitory computer-readable media shall include volatile and/or non-volatile memory. It shall be noted that alternative implementations are possible, including a hardware implementation or a software/hardware implementation. Hardware-implemented functions may be realized using ASIC(s), programmable arrays, digital signal processing circuitry, or the like. Accordingly, the “means” terms in any claims are intended to cover both software and hardware implementations. Similarly, the term “computer-readable medium or media” as used herein includes software and/or hardware having a program of instructions embodied thereon, or a combination thereof. With these implementation alternatives in mind, it is to be understood that the figures and accompanying description provide the functional information one skilled in the art would require to write program code (i.e., software) and/or to fabricate circuits (i.e., hardware) to perform the processing required.
It shall be noted that embodiments of the present disclosure may further relate to computer products with a non-transitory, tangible computer-readable medium that have computer code thereon for performing various computer-implemented operations. The media and computer code may be those specially designed and constructed for the purposes of the present disclosure, or they may be of the kind known or available to those having skill in the relevant arts. Examples of tangible computer-readable media include, for example: magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and holographic devices; magneto-optical media; and hardware devices that are specially configured to store or to store and execute program code, such as application specific integrated circuits (ASICs), programmable logic devices (PLDs), flash memory devices, other non-volatile memory (NVM) devices (such as 3D XPoint-based devices), and ROM and RAM devices. Examples of computer code include machine code, such as produced by a compiler, and files containing higher level code that are executed by a computer using an interpreter. Embodiments of the present disclosure may be implemented in whole or in part as machine-executable instructions that may be in program modules that are executed by a processing device. Examples of program modules include libraries, programs, routines, objects, components, and data structures. In distributed computing environments, program modules may be physically located in settings that are local, remote, or both.
One skilled in the art will recognize no computing system or programming language is critical to the practice of the present disclosure. One skilled in the art will also recognize that a number of the elements described above may be physically and/or functionally separated into modules and/or sub-modules or combined together.
In the foregoing specification, embodiments of the disclosure have been described with reference to specific exemplary embodiments thereof. It will be evident that various modifications may be made thereto without departing from the broader spirit and scope of the disclosure as set forth in the following claims. The specification and drawings are, accordingly, to be regarded in an illustrative sense rather than a restrictive sense.