This application claims the priority benefit of Taiwan application serial no. 106140053, filed on Nov. 20, 2017. The entirety of the above-mentioned patent application is hereby incorporated by reference herein and made a part of this specification.
The disclosure relates to a wearable device, and more particularly, to a wearable device capable of recognizing sleep stage and a recognition method thereof.
In the technical field of sleep stage recognition, sleep stages are commonly assessed with a polysomnography (PSG) system through multiple physiological sensing operations. The physiological sensing operations include, for example, electroencephalography (EEG), electrooculography (EOG), electrocardiogram (ECG), electromyography (EMG), respiratory effort, air flow, blood pressure, blood oxygen saturation (SaO2), heart rate and sleep posture. In other words, the common sleep stage recognition requires complicated accessories to be worn by a wearer and requires analysis of a large amount of sensed data. In consideration of the above, providing a wearable device capable of effectively sensing a sleep stage of the wearer while remaining convenient to wear is one of the important issues to be addressed in the field.
The disclosure provides a wearable device capable of recognizing sleep stage and a recognition method thereof, which can effectively recognize the sleep stage of the wearer while remaining convenient to wear.
A wearable device capable of recognizing sleep stage of the disclosure includes a processor, an electrocardiogram sensor, an acceleration sensor and an angular acceleration sensor. The processor is configured to train a neural network module. The electrocardiogram sensor is coupled to the processor. The electrocardiogram sensor generates an electrocardiogram signal. The processor analyzes the electrocardiogram signal to generate a plurality of first characteristic values. The acceleration sensor generates an acceleration signal. The processor analyzes the acceleration signal to generate a plurality of second characteristic values. The angular acceleration sensor generates an angular acceleration signal. The processor analyzes the angular acceleration signal to generate a plurality of third characteristic values. The processor utilizes the trained neural network module to perform a sleep stage recognition operation according to the first characteristic values, the second characteristic values and the third characteristic values, so as to obtain a sleep stage recognition result.
A recognition method of sleep stage is adapted to a wearable device. The wearable device includes a processor, an electrocardiogram sensor, an acceleration sensor and an angular acceleration sensor. The method includes: training a neural network module by the processor; generating an electrocardiogram signal by the electrocardiogram sensor, and analyzing the electrocardiogram signal by the processor to generate a plurality of first characteristic values; generating an acceleration signal by the acceleration sensor, and analyzing the acceleration signal by the processor to generate a plurality of second characteristic values; generating an angular acceleration signal by the angular acceleration sensor, and analyzing the angular acceleration signal by the processor to generate a plurality of third characteristic values; and utilizing the trained neural network module by the processor to perform a sleep stage recognition operation according to the first characteristic values, the second characteristic values and the third characteristic values, so as to obtain a sleep stage recognition result.
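Purely for illustration, and not as a limitation of the method, the overall flow described above may be sketched as follows, assuming that feature-extraction routines and a trained classifier are available as callables; the names recognize_sleep_stage, extract_ecg, extract_motion and predict are illustrative assumptions rather than elements of the disclosure.

```python
import numpy as np

def recognize_sleep_stage(ecg, accel, gyro, extract_ecg, extract_motion, model):
    """Return one sleep stage label for a single analysis segment.

    ecg: 1-D array of electrocardiogram samples; accel, gyro: (N, 3) arrays of
    the three sensing axes; extract_ecg / extract_motion: callables returning
    the characteristic values; model: a trained classifier exposing predict().
    """
    first = np.asarray(extract_ecg(ecg))        # first characteristic values (electrocardiogram)
    second = np.asarray(extract_motion(accel))  # second characteristic values (acceleration)
    third = np.asarray(extract_motion(gyro))    # third characteristic values (angular acceleration)
    features = np.concatenate([first, second, third]).reshape(1, -1)
    return model.predict(features)[0]           # e.g. "wake", "N1", "N2", "N3" or "REM"
```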
Based on the above, the wearable device capable of recognizing sleep stage and the recognition method thereof according to the disclosure can sense a plurality of characteristic values by the electrocardiogram sensor, the acceleration sensor and the angular acceleration sensor. In this way, the processor can utilize the trained neural network module to perform the sleep stage recognition operation according to the characteristic values. As a result, the wearable device of the disclosure can effectively obtain a recognition result of the sleep stage while remaining convenient to wear.
To make the above features and advantages of the disclosure more comprehensible, several embodiments accompanied with drawings are described in detail as follows.
The accompanying drawings are included to provide a further understanding of the disclosure, and are incorporated in and constitute a part of this specification. The drawings illustrate embodiments of the disclosure and, together with the description, serve to explain the principles of the disclosure.
In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the disclosed embodiments. It will be apparent, however, that one or more embodiments may be practiced without these specific details. In other instances, well-known structures and devices are schematically shown in order to simplify the drawing.
In order to make the content of the disclosure more comprehensible, embodiments are provided below to describe the disclosure in detail; however, the disclosure is not limited to the provided embodiments, and the provided embodiments may be suitably combined with one another. Moreover, elements/components/steps with the same reference numerals represent the same or similar parts in the drawings and embodiments.
In the present embodiment, the electrocardiogram sensor 130 is configured to sense an electrocardiogram signal ECG and provide the electrocardiogram signal to the processor 110. The processor 110 analyzes the electrocardiogram signal to generate a plurality of first characteristic values. In the present embodiment, the acceleration sensor 140 is configured to sense an acceleration caused by movement of the wearer's body or wearing part and provide an acceleration signal to the processor 110. The processor 110 analyzes the acceleration signal to generate a plurality of second characteristic values. In the present embodiment, the angular acceleration sensor 150 is configured to sense an angular acceleration caused by turning of the wearer's body or wearing part and provide an angular acceleration signal to the processor 110. The processor 110 analyzes the angular acceleration signal to generate a plurality of third characteristic values. In other words, because the wearer may intentionally or unintentionally turn over or move his/her body while asleep, the wearable device 100 of the present embodiment senses the turning or moving of the wearing part in addition to the electrocardiogram information of the wearer, so that the analysis result provided by the processor 110 takes the effect of the wearer's movements into consideration.
In the present embodiment, the processor 110 is, for example, a central processing unit (CPU), a system on chip (SoC) or other programmable devices for general purpose or special purpose, such as a microprocessor, a digital signal processor (DSP), a programmable controller, an application specific integrated circuit (ASIC), a programmable logic device (PLD), other similar devices or a combination of the above-mentioned devices.
In the present embodiment, the storage device 120 is, for example, a dynamic random access memory (DRAM), a flash memory or a non-volatile random access memory (NVRAM). In the present embodiment, the storage device 120 is configured to store data and program modules described in each embodiment of the disclosure, which can be read and executed by the processor 110 so the wearable device 100 can realize the recognition method of sleep stage described in each embodiment of the disclosure.
In the present embodiment, the storage device 120 further includes a neural network module 121. The processor 110 can sense a plurality of characteristic values from different wearers in advance by the electrocardiogram sensor 130, the acceleration sensor 140 and the angular acceleration sensor 150, and use the characteristic values from the different wearers as sample data. In the present embodiment, the processor 110 can create a prediction model according to determination conditions, algorithms and parameters related to the sleep stages, and use a plurality of pieces of the sample data for training or correcting the prediction model. Accordingly, when a sleep stage recognition operation is performed by the wearable device 100 for the wearer, the processor 110 can utilize the trained neural network module 121 to obtain a recognition result according to the first characteristic values, the second characteristic values and the third characteristic values sensed by the electrocardiogram sensor 130, the acceleration sensor 140 and the angular acceleration sensor 150. Nevertheless, enough teaching, suggestion, and implementation illustration regarding algorithms and calculation modes for the trained neural network module 121 of the present embodiment may be obtained with reference to common knowledge in the related art, which is not repeated hereinafter.
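For illustration only, a possible training flow is sketched below; the use of scikit-learn's MLPClassifier, the hidden-layer sizes and the random placeholder data are illustrative assumptions and do not represent the actual architecture or training data of the neural network module 121.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def train_sleep_stage_model(features, labels):
    """features: (n_samples, n_characteristic_values) array; labels: PSG-scored stage per sample."""
    model = make_pipeline(
        StandardScaler(),  # the characteristic values have very different scales
        MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0),
    )
    model.fit(features, labels)
    return model

# Illustrative usage with random placeholder data: 21 characteristic values per sample.
X = np.random.rand(200, 21)
y = np.random.choice(["wake", "N1", "N2", "N3", "REM"], size=200)
trained_model = train_sleep_stage_model(X, y)
```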
In the present embodiment, the heart rate variability analysis module 320 analyzes the R-wave signals, so as to obtain a low frequency signal LF, a high frequency signal HF, a detrended fluctuation analysis signal DFA, a first sample entropy signal SE1 and a second sample entropy signal SE2 as shown in
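Purely for illustration, the low frequency signal LF and the high frequency signal HF may be computed along the lines of the following sketch, assuming that the R-peak times have already been detected from the R-wave signals; the 4 Hz resampling rate and the conventional HRV band limits (0.04-0.15 Hz and 0.15-0.4 Hz) are illustrative choices rather than values taken from the disclosure.

```python
import numpy as np
from scipy.signal import welch

def hrv_lf_hf(r_peak_times, fs_resample=4.0):
    """Return (LF power, HF power) from the RR-interval series of detected R waves."""
    r_peak_times = np.asarray(r_peak_times, dtype=float)
    rr = np.diff(r_peak_times)                 # RR intervals in seconds
    t_rr = r_peak_times[1:]                    # time stamp of each interval
    t_even = np.arange(t_rr[0], t_rr[-1], 1.0 / fs_resample)
    rr_even = np.interp(t_even, t_rr, rr)      # evenly resampled tachogram
    freqs, psd = welch(rr_even - np.mean(rr_even), fs=fs_resample,
                       nperseg=min(256, len(rr_even)))
    df = freqs[1] - freqs[0]
    lf = np.sum(psd[(freqs >= 0.04) & (freqs < 0.15)]) * df   # low-frequency power
    hf = np.sum(psd[(freqs >= 0.15) & (freqs < 0.40)]) * df   # high-frequency power
    return lf, hf
```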
In the present embodiment, the R-wave amplitude analysis module 330 analyzes the R-wave signals, so as to obtain a result including a turning point ratio value EDR_TPR and a signal strength value EDR_BR as shown in
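As an illustrative sketch, a generic turning point ratio of the R-wave amplitude (ECG-derived respiration) series may be computed as follows; this generic definition is an assumption made for illustration and is not necessarily the exact calculation of the turning point ratio value EDR_TPR.

```python
import numpy as np

def turning_point_ratio(r_amplitudes):
    """Fraction of interior points of the series that are local maxima or minima."""
    x = np.asarray(r_amplitudes, dtype=float)
    d_in = x[1:-1] - x[:-2]             # step into each interior point
    d_out = x[2:] - x[1:-1]             # step out of each interior point
    turning = np.sum(d_in * d_out < 0)  # a sign change marks a turning point
    return turning / (len(x) - 2)       # a purely random series gives about 2/3
```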
In the present embodiment, the X-axis acceleration analysis module 611 can generate a first acceleration detrended fluctuation analysis value AX_DFA and a first acceleration sample entropy value AX_SE1. The Y-axis acceleration analysis module 612 can generate a second acceleration detrended fluctuation analysis value AY_DFA and a second acceleration sample entropy value AY_SE2. The Z-axis acceleration analysis module 613 can generate a third acceleration detrended fluctuation analysis value AZ_DFA and a third acceleration sample entropy value AZ_SE3. In other words, the second characteristic values of the present embodiment include the first acceleration detrended fluctuation analysis value AX_DFA, the first acceleration sample entropy value AX_SE1, the second acceleration detrended fluctuation analysis value AY_DFA, the second acceleration sample entropy value AY_SE2, the third acceleration detrended fluctuation analysis value AZ_DFA and the third acceleration sample entropy value AZ_SE3.
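Purely for illustration, generic forms of the detrended fluctuation analysis and sample entropy calculations, applied to each acceleration axis, are sketched below; the scale range, the embedding dimension m=2 and the tolerance r of 0.2 times the standard deviation are conventional illustrative defaults rather than values taken from the disclosure.

```python
import numpy as np

def dfa_alpha(x, scales=None):
    """Detrended fluctuation analysis scaling exponent of a 1-D series."""
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - np.mean(x))                      # integrated, mean-centred profile
    n = len(y)
    if scales is None:
        scales = np.unique(np.logspace(np.log10(4), np.log10(n // 4), 10).astype(int))
    flucts = []
    for s in scales:
        n_win = n // s
        segments = y[:n_win * s].reshape(n_win, s)
        t = np.arange(s)
        residuals = []
        for seg in segments:
            coef = np.polyfit(t, seg, 1)               # local linear detrending
            residuals.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        flucts.append(np.sqrt(np.mean(residuals)))
    return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

def sample_entropy(x, m=2, r=None):
    """Sample entropy SampEn(m, r) of a 1-D series using the Chebyshev distance."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * np.std(x)
    n = len(x)

    def count_matches(length):
        templates = np.array([x[i:i + length] for i in range(n - length)])
        count = 0
        for i in range(len(templates) - 1):
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(dist <= r)
        return count

    b = count_matches(m)
    a = count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")

def acceleration_features(accel):
    """accel: (N, 3) array of the X, Y and Z axes; returns the six second characteristic values."""
    out = {}
    for idx, axis in enumerate("XYZ"):
        series = np.asarray(accel[:, idx], dtype=float)
        out[f"A{axis}_DFA"] = dfa_alpha(series)
        out[f"A{axis}_SE{idx + 1}"] = sample_entropy(series)
    return out
```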
Nevertheless, enough teaching, suggestion, and implementation illustration regarding details of the related calculations for the acceleration detrended fluctuation analysis values and the acceleration sample entropy values of the present embodiment may be obtained with reference to common knowledge in the related art, which is not repeated hereinafter.
In the present embodiment, the X-axis angular acceleration analysis module 811 can generate a first angular acceleration detrended fluctuation analysis value AnX_DFA and a first angular acceleration sample entropy value AnX_SE1. The Y-axis angular acceleration analysis module 812 can generate a second angular acceleration detrended fluctuation analysis value AnY_DFA and a second angular acceleration sample entropy value AnY_SE2. The Z-axis angular acceleration analysis module 813 can generate a third angular acceleration detrended fluctuation analysis value AnZ_DFA and a third angular acceleration sample entropy value AnZ_SE3. In other words, the third characteristic values of the present embodiment include the first angular acceleration detrended fluctuation analysis value AnX_DFA, the first angular acceleration sample entropy value AnX_SE1, the second angular acceleration detrended fluctuation analysis value AnY_DFA, the second angular acceleration sample entropy value AnY_SE2, the third angular acceleration detrended fluctuation analysis value AnZ_DFA and the third angular acceleration sample entropy value AnZ_SE3.
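For illustration only, the same per-axis treatment can be applied to the angular acceleration signal, assuming that dfa_alpha and sample_entropy helpers such as those sketched for the acceleration features are passed in as callables.

```python
import numpy as np

def angular_acceleration_features(gyro, dfa_alpha, sample_entropy):
    """gyro: (N, 3) array of the X, Y and Z axes; returns the six third characteristic values."""
    out = {}
    for idx, axis in enumerate("XYZ"):
        series = np.asarray(gyro[:, idx], dtype=float)
        out[f"An{axis}_DFA"] = dfa_alpha(series)
        out[f"An{axis}_SE{idx + 1}"] = sample_entropy(series)
    return out
```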
Nevertheless, enough teaching, suggestion, and implementation illustration regarding details of the related calculations for the angular acceleration detrended fluctuation analysis values and the angular acceleration sample entropy values of the present embodiment may be obtained with reference to common knowledge in the related art, which is not repeated hereinafter.
In this way, the recognition method of the present embodiment can recognize the sleep stage of the wearer according to said 21 characteristic values sensed by the electrocardiogram sensor 130, the acceleration sensor 140 and the angular acceleration sensor 150. In the present embodiment, the sleep stage recognition result is one of the wakefulness stage, the first non-rapid eye movement stage, the second non-rapid eye movement stage, the third non-rapid eye movement stage and the rapid eye movement stage. Also, the wakefulness stage, the first non-rapid eye movement stage, the second non-rapid eye movement stage, the third non-rapid eye movement stage and the rapid eye movement stage are established by a polysomnography standard.
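Purely for illustration, a per-segment recognition loop over the 21 characteristic values may look as follows, assuming a trained classifier with a predict() method; the label names are merely shorthand for the five stages listed above.

```python
import numpy as np

SLEEP_STAGES = ("wake", "N1", "N2", "N3", "REM")  # wakefulness, NREM stages 1-3, rapid eye movement

def recognize_recording(segment_features, model):
    """segment_features: (n_segments, 21) array of characteristic values; returns one stage per segment."""
    predictions = model.predict(np.asarray(segment_features))
    # Map class indices to stage names when the model outputs indices instead of labels.
    return [p if isinstance(p, str) else SLEEP_STAGES[int(p)] for p in predictions]
```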
In addition, sufficient teaching, suggestion, and implementation regarding detailed features of each module and each sensor in the wearable device 100 of the present embodiment of the disclosure may be obtained from the foregoing embodiments of
In summary, the wearable device capable of recognizing sleep stage and the recognition method of sleep stage according to the disclosure can provide an accurate sleep recognition function. The wearable device includes the electrocardiogram sensor, the acceleration sensor and the angular acceleration sensor so the wearable device can sense the electrocardiogram information and posture variation of the wearer. The wearable device of the disclosure can use the trained neural network module to perform the sleep stage recognition operation according to the characteristic values provided by the electrocardiogram sensor, the acceleration sensor and the angular acceleration sensor. Moreover, the wearable device of the disclosure can integrate each of the sensors into one wearable item instead of complicated accessories. As a result, the wearable device of the disclosure is suitable for the wearer to conveniently and effectively monitor the sleep state in a home environment so as to obtain the recognition result of the sleep stage of the wearer.
Although the present disclosure has been described with reference to the above embodiments, it will be apparent to one of ordinary skill in the art that modifications to the described embodiments may be made without departing from the spirit of the disclosure. Accordingly, the scope of the disclosure will be defined by the attached claims and not by the above detailed descriptions.
| Number | Date | Country | Kind |
| --- | --- | --- | --- |
| 106140053 | Nov 2017 | TW | national |