This application claims the priority benefit of Taiwan application serial no. 107143784, filed on Dec. 5, 2018. The entirety of the above-mentioned patent application is hereby incorporated by reference herein and made a part of this specification.
The disclosure relates to a physiological status evaluation technology, and particularly relates to a method and a system for evaluating cardiac status, an electronic device and an ultrasonic scanning device.
A cardiac ultrasonic image may reflect the structure and function of a heart, for example, the size of the heart, a contraction status of the heart and/or a heart valve activity. The cardiac ultrasonic image may be a two-dimensional (2D) image or a three-dimensional (3D) image. The information provided by a 2D ultrasonic image is considerably less than that provided by a 3D ultrasonic image. For example, a 2D ultrasonic image cannot provide depth information, whereas a 3D ultrasonic image carries complete depth information, which allows a cardiac status to be evaluated more accurately. However, the equipment for capturing 3D ultrasonic images is very expensive and therefore not widely deployed. Therefore, how to more conveniently provide evaluation information of the cardiac status based on 2D ultrasonic images is one of the subjects studied by those skilled in the art.
The disclosure is directed to a method and a system for evaluating cardiac status, an electronic device and an ultrasonic scanning device, which are adapted to automatically evaluate a cardiac status of a user based on 2D ultrasonic images, so as to effectively improve the usage rate of a 2D ultrasonic scanning device.
An embodiment of the disclosure provides a method for evaluating cardiac status including: obtaining at least one first image, wherein each of the at least one first image is a two-dimensional image and includes a first cardiac pattern; training a deep learning model by using the first image; and analyzing at least one second image by using the trained deep learning model to automatically evaluate a cardiac status of a user, wherein each of the at least one second image is a two-dimensional image and includes a second cardiac pattern.
An embodiment of the disclosure provides an electronic device including a storage device and a processor. The storage device is configured to store at least one first image and at least one second image. Each of the at least one first image is a two-dimensional image and includes a first cardiac pattern, and each of the at least one second image is a two-dimensional image and includes a second cardiac pattern. The processor is coupled to the storage device. The processor trains a deep learning model by using the first image. The processor analyzes the at least one second image by using the trained deep learning model to automatically evaluate a cardiac status of a user.
An embodiment of the disclosure provides a cardiac status evaluation system including an ultrasonic scanning device and an electronic device. The ultrasonic scanning device is configured to perform an ultrasonic scan on a user to obtain at least one image. Each of the at least one image is a two-dimensional image and includes a cardiac pattern. The electronic device is coupled to the ultrasonic scanning device. The electronic device analyzes the image by using a deep learning model to automatically evaluate a cardiac status of the user.
An embodiment of the disclosure provides an ultrasonic scanning device including an ultrasonic scanner and a processor. The ultrasonic scanner is configured to perform an ultrasonic scan on a user to obtain at least one image. Each of the at least one image is a two-dimensional image and includes a cardiac pattern. The processor is coupled to the ultrasonic scanner. The processor analyzes the image by using a deep learning model to automatically evaluate a cardiac status of the user.
According to the above description, a 2D ultrasonic image including the cardiac pattern of the user may be analyzed by the deep learning model, so as to automatically evaluate the cardiac status of the user. Moreover, the deep learning model may be trained by 2D ultrasonic images including cardiac patterns, so as to improve evaluation accuracy. In this way, the usage rate of 2D ultrasonic scanning devices may be effectively enhanced, thereby reducing the deployment cost of ultrasonic scanning equipment.
To make the aforementioned more comprehensible, several embodiments accompanied with drawings are described in detail as follows.
The accompanying drawings are included to provide a further understanding of the disclosure, and are incorporated in and constitute a part of this specification. The drawings illustrate embodiments of the disclosure and, together with the description, serve to explain the principles of the disclosure.
The ultrasonic scanning device 11 is configured to perform an ultrasonic scan on the body of a user to obtain at least one ultrasonic image reflecting a structure and/or a function of at least one body organ of the user. For example, after the heart of the user is scanned by the ultrasonic scanning device 11, an ultrasonic image including a cardiac pattern is obtained. The cardiac pattern may reflect a structure and/or a function of the heart of the user. In an embodiment, the ultrasonic scanning device 11 may also be used for scanning other body parts of the user to obtain the corresponding ultrasonic images, which is not limited by the disclosure.
It should be noted that in the following embodiments, a two-dimensional (2D) ultrasonic scanning device serves as the ultrasonic scanning device 11. For example, the ultrasonic scanning device 11 may perform a 2D ultrasonic scan on the body of the user to obtain a 2D ultrasonic image. However, in another embodiment, the ultrasonic scanning device 11 may also be a 3D ultrasonic scanning device, which is not limited by the disclosure.
The ultrasonic scanning device 11 may include an ultrasonic scanner 111 and a processor 112. The ultrasonic scanner 111 is configured to perform the ultrasonic scan on the body of the user. The processor 112 is coupled to the ultrasonic scanner 111. The processor 112 may be a Central Processing Unit (CPU), a graphics processor or other programmable general purpose or special purpose microprocessor, a Digital Signal Processor (DSP), a programmable controller, an Application Specific Integrated Circuit (ASIC), a Programmable Logic Device (PLD), or other similar device or a combination of the above devices.
The processor 112 may control an overall or a partial operation of the ultrasonic scanning device 11. In an embodiment, the processor 112 may control the ultrasonic scanner 111 to execute the ultrasonic scanning. In an embodiment, the processor 112 may generate an ultrasonic image according to a scanning result of the ultrasonic scanner 111.
The electronic device 12 may be a notebook computer, a desktop computer, a tablet computer, an industrial computer, a server or a smart phone, etc., that has a data transmission function, a data storage function and a data computation function. The type and the number of the electronic device 12 are not limited by the disclosure. In an embodiment, the electronic device 12 and the ultrasonic scanning device 11 may also be combined into one single device.
The electronic device 12 includes a processor 121, a storage device 122, an input/output interface 123 and a deep learning model 124. The processor 121 may be a CPU, a graphics processor or other programmable general purpose or special purpose microprocessor, a DSP, a programmable controller, an ASIC, a PLD, or other similar device or a combination of the above devices. The processor 121 may control an overall or partial operation of the electronic device 12.
The storage device 122 is coupled to the processor 121. The storage device 122 is used for storing data. For example, the storage device 122 may include a volatile storage medium and a non-volatile storage medium, where the volatile storage medium may be a Random Access Memory (RAM), and the non-volatile storage medium may be a Read Only Memory (ROM), a Solid State Drive (SSD) or a conventional hard drive.
The input/output interface 123 is coupled to the processor 121. The input/output interface 123 is used for receiving signals and/or outputting signals. For example, the input/output interface 123 may include a screen, a touch screen, a touch panel, a mouse, a keyboard, a physical key, a speaker, a microphone, a wired communication interface and/or a wireless communication interface, and the type of the input/output interface 123 is not limited thereto.
The deep learning model 124 may be implemented by software or hardware. In an embodiment, the deep learning model 124 may be implemented by a hardware circuit. For example, the deep learning model 124 may be a CPU, a graphics processor or other programmable general purpose or special purpose microprocessor, a DSP, a programmable controller, an ASIC, a PLD, or other similar device or a combination of the above devices. In an embodiment, the deep learning model 124 may be implemented in software. For example, the deep learning model 124 may be program codes stored in the storage device 122 and executed by the processor 121. Moreover, the deep learning model 124 may be a Convolutional Neural Network (CNN) or another type of neural network.
The processor 121 may train the deep learning model 124 by using the ultrasonic images 201(1)-201(N). For example, regarding the ultrasonic image 201(1), the deep learning model 124 may automatically detect an edge and/or a position of a specific part in the cardiac pattern. For example, the specific part may include a left ventricle, a right ventricle, a left atrium, a right atrium and/or a mitral valve, and the specific part may also include other portions of the heart. The deep learning model 124 may compare a detection result with a correct (labeled) result to gradually improve its image recognition capability. In other words, through training, the deep learning model 124 gradually increases its ability to recognize the cardiac patterns in the ultrasonic images.
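As an illustration only, the following is a minimal sketch, not the disclosed implementation, of such a training procedure: a small convolutional network learns to mark, pixel by pixel, a cardiac part such as the left ventricle in 2D ultrasound frames. The network architecture, tensor shapes and the random stand-in data are all assumptions for the example.

```python
# Minimal sketch: train a tiny CNN to segment one cardiac structure
# (e.g., the left ventricle) in 2D ultrasound frames. Hypothetical
# architecture and data; a real system would use labeled clinical images.
import torch
import torch.nn as nn

class SmallSegNet(nn.Module):
    """Tiny encoder-decoder CNN; a stand-in for a full segmentation network."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
            nn.Conv2d(32, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 1),  # per-pixel logit: inside/outside the structure
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = SmallSegNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
# The loss compares the detection result with the correct (labeled) result.
loss_fn = nn.BCEWithLogitsLoss()

# Hypothetical batch: 8 grayscale 128x128 frames with binary masks.
images = torch.rand(8, 1, 128, 128)
masks = torch.randint(0, 2, (8, 1, 128, 128)).float()

for epoch in range(10):
    optimizer.zero_grad()
    loss = loss_fn(model(images), masks)
    loss.backward()   # gradually improve recognition of the cardiac pattern
    optimizer.step()
```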
The trained deep learning model 124 may be used for analyzing the ultrasonic images 301(1)-301(M). For example, the processor 121 may use the deep learning model 124 to analyze the ultrasonic images 301(1)-301(M) to automatically evaluate a cardiac status of the target user. For example, regarding the ultrasonic image 301(1), the deep learning model 124 may automatically detect an edge and/or a position of a specific part (for example, the left ventricle, the right ventricle, the left atrium, the right atrium, the mitral valve and/or another portion of the heart) in the cardiac pattern. The processor 121 may then automatically evaluate the cardiac status of the target user according to the detection result.
The processor 121 may use the deep learning model 124 to analyze the ultrasonic images 301(1)-301(M) and generate an evaluation result. The evaluation result may reflect the cardiac status of the target user. In an embodiment, the evaluation result may reflect at least one of an end-diastolic volume, an end-systolic volume, a left ventricular boundary, a maximum left ventricular boundary, a minimum left ventricular boundary, an average left ventricular boundary, and a cardiac ejection rate (i.e., an ejection fraction) of the heart of the target user (which is also referred to as a target heart). In an embodiment, the evaluation result may reflect a possible future physiological status of the target user, for example, ventricular hypertrophy, hypertension and/or heart failure. In an embodiment, the evaluation result may reflect a health status and/or possible defects of the target heart.
In an embodiment, the processor 121 may use the deep learning model 124 to analyze the ultrasonic images 301(1)-301(M) to obtain an end-diastolic volume of the target heart and an end-systolic volume of the target heart. Different combinations of the end-diastolic volume and the end-systolic volume may correspond to different cardiac statuses. The processor 121 may evaluate the cardiac status of the target user according to the end-diastolic volume of the target heart and the end-systolic volume of the target heart. For example, the processor 121 may query a database according to the obtained end-diastolic volume and end-systolic volume to evaluate the cardiac status of the target user. Alternatively, the processor 121 may input the obtained end-diastolic volume and end-systolic volume into a specific algorithm to evaluate the cardiac status of the target user.
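As a simple illustration of the "specific algorithm" alternative, the following sketch maps an (EDV, ESV) pair to a coarse status label. The thresholds are hypothetical and non-clinical; a real system would instead query the database described above.

```python
# Minimal sketch: map end-diastolic volume (EDV) and end-systolic volume
# (ESV) to a coarse status label. Thresholds are illustrative assumptions,
# not clinical criteria.
def evaluate_cardiac_status(edv_ml: float, esv_ml: float) -> str:
    if edv_ml <= 0 or esv_ml < 0 or esv_ml >= edv_ml:
        return "invalid measurement"
    stroke_volume = edv_ml - esv_ml  # blood ejected per beat (mL)
    if stroke_volume < 40:           # illustrative threshold
        return "reduced stroke volume"
    if edv_ml > 200:                 # illustrative threshold
        return "possible ventricular enlargement"
    return "within the illustrative normal range"

print(evaluate_cardiac_status(edv_ml=120.0, esv_ml=50.0))
```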
In an embodiment, the processor 121 may use the deep learning model 124 to analyze the ultrasonic images 301(1)-301(M) to automatically detect a maximum left ventricular boundary corresponding to the second cardiac patterns and a minimum left ventricular boundary corresponding to the second cardiac patterns. Then, the processor 121 may obtain the end-diastolic volume of the target heart and the end-systolic volume of the target heart according to the maximum left ventricular boundary and the minimum left ventricular boundary, respectively.
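A minimal sketch of this selection step follows, assuming one detected left ventricular cross-sectional area per frame (the values are hypothetical, e.g., derived from the segmentation model sketched above): the frame with the largest boundary approximates end-diastole and the frame with the smallest boundary approximates end-systole.

```python
# Minimal sketch: locate end-diastole and end-systole in an image sequence
# from per-frame left ventricular boundary areas (hypothetical values).
lv_areas_cm2 = [18.2, 21.5, 23.9, 22.8, 17.4, 12.1, 10.3, 13.6]  # one per frame

ed_frame = max(range(len(lv_areas_cm2)), key=lambda i: lv_areas_cm2[i])
es_frame = min(range(len(lv_areas_cm2)), key=lambda i: lv_areas_cm2[i])
print(f"end-diastolic frame: {ed_frame}, end-systolic frame: {es_frame}")
```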
It should be noted that in an embodiment, the deep learning model 124 may automatically recognize the direction of a cardiac pattern in a certain ultrasonic image, for example, whether it is a frontal cardiac pattern or a lateral cardiac pattern. The deep learning model 124 may analyze the ultrasonic images 301(1)-301(M) to obtain the maximum left ventricular boundaries of the target heart and the minimum left ventricular boundaries of the target heart in at least two directions.
In an embodiment, after the maximum left ventricular boundaries and the minimum left ventricular boundaries of the target heart in at least two directions are obtained, the processor 121 may respectively obtain the end-diastolic volume of the target heart and the end-systolic volume of the target heart based on Simpson's method (i.e., the method of disks). For example, the processor 121 may obtain the end-diastolic volume of the target heart or the end-systolic volume of the target heart based on the following equation (1.1).
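$$V = \frac{\pi}{4}\sum_{i=1}^{P} a_i \, b_i \cdot \frac{L}{P} \tag{1.1}$$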
In the equation (1.1), the parameter V is a volume of the target heart, the parameter a_i is a width (for example, a short-axis length) of the left ventricle in the ultrasonic image of the target heart in a first direction (for example, a front view), the parameter b_i is a width (for example, a coronal-plane short-axis length) of the left ventricle in the ultrasonic image of the target heart in a second direction (for example, a side view), the parameter P may be 20 or another value, and the parameter L is a length (for example, a long-axis length) of the heart. The processor 121 may automatically obtain the required parameters a_i, b_i and L from the ultrasonic images 301(1)-301(M) through the deep learning model 124, so as to calculate the end-diastolic volume of the target heart or the end-systolic volume of the target heart.
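The following sketch evaluates equation (1.1) numerically. The per-disk widths, long-axis length and disk count are hypothetical values chosen only to make the example runnable.

```python
# Minimal sketch of equation (1.1), the biplane Simpson's method of disks:
# slice the left ventricle into P disks along its long axis and sum the
# elliptical disk volumes estimated from two roughly orthogonal 2D views.
import math

def simpson_volume(a_widths, b_widths, long_axis_cm, disks=20):
    """Approximate left ventricular volume (mL; 1 cm^3 = 1 mL)."""
    assert len(a_widths) == len(b_widths) == disks
    disk_height = long_axis_cm / disks  # L / P
    return sum(
        (math.pi / 4.0) * a * b * disk_height   # (pi/4) * a_i * b_i * (L/P)
        for a, b in zip(a_widths, b_widths)
    )

# Hypothetical per-disk widths (cm) for an end-diastolic frame, P = 20 disks.
a = [4.0] * 20   # widths from the first direction (front view)
b = [3.8] * 20   # widths from the second direction (side view)
edv = simpson_volume(a, b, long_axis_cm=8.0)
print(f"estimated end-diastolic volume: {edv:.1f} mL")
```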
In other words, by automatically analyzing the ultrasonic images 301(1)-301(M) captured from different angles, even though none of the ultrasonic images 301(1)-301(M) carries depth information, the end-diastolic volume of the target heart and the end-systolic volume of the target heart may still be evaluated accurately. Then, the processor 121 may evaluate the cardiac status of the target user according to the end-diastolic volume and the end-systolic volume.
In an embodiment, the processor 121 may obtain a cardiac ejection rate of the target heart according to the end-diastolic volume and the end-systolic volume of the target heart. For example, the processor 121 may obtain the cardiac ejection rate of the target heart according to the following equation (1.2).
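$$EF = \frac{EDV - ESV}{EDV} \times 100\% \tag{1.2}$$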
In the equation (1.2), the parameter EF represents the cardiac ejection rate of the target heart, the parameter EDV represents the end-diastolic volume of the target heart, and the parameter ESV represents the end-systolic volume of the target heart.
In an embodiment, the processor 121 may evaluate the cardiac status of the target user according to the cardiac ejection rate of the target heart. For example, cardiac ejection rates in different value ranges may correspond to different types of cardiac status, and the processor 121 may evaluate the cardiac status of the target user according to the value range in which the cardiac ejection rate of the target heart falls. For example, the processor 121 may look up a database according to the cardiac ejection rate of the target heart to evaluate the cardiac status of the target user. Alternatively, the processor 121 may input the obtained cardiac ejection rate into a specific algorithm to evaluate the cardiac status of the target user.
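As an illustration of such a range-based evaluation, the following sketch computes the cardiac ejection rate per equation (1.2) and classifies it using hypothetical, non-clinical range boundaries in place of the database lookup described above.

```python
# Minimal sketch: compute the cardiac ejection rate via equation (1.2) and
# map its value range to a coarse label. Boundaries are illustrative only.
def ejection_fraction(edv_ml: float, esv_ml: float) -> float:
    return (edv_ml - esv_ml) / edv_ml * 100.0  # equation (1.2), in percent

def classify_by_ef(ef_percent: float) -> str:
    if ef_percent >= 50.0:   # illustrative range boundary
        return "preserved ejection fraction"
    if ef_percent >= 40.0:   # illustrative range boundary
        return "mildly reduced ejection fraction"
    return "reduced ejection fraction"

ef = ejection_fraction(edv_ml=120.0, esv_ml=50.0)
print(f"EF = {ef:.1f}% -> {classify_by_ef(ef)}")
```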
It should be noted that in the aforementioned embodiments, the operation of automatically evaluating the cardiac status of the target heart is executed by the processor 121 of the electronic device 12. However, in another embodiment, the operation of automatically evaluating the cardiac status of the target heart may also be executed by the processor 112 of the ultrasonic scanning device 11. For example, the deep learning model 124 may also be implemented in the ultrasonic scanning device 11 and executed by the processor 112. In this way, the ultrasonic scanning device 11 may automatically perform the ultrasonic scan, the analysis of the ultrasonic images and the evaluation of the cardiac status of the target user. The related operation details have been described above and are not repeated here. Moreover, the deep learning model 124 may be trained by the processor 112 or 121, or trained by another electronic device or server, which is not limited by the disclosure.
In summary, 2D ultrasonic images including the cardiac patterns of the user may be analyzed by the deep learning model, so as to automatically evaluate the cardiac status of the user. Moreover, the deep learning model may be trained by 2D ultrasonic images including cardiac patterns, so as to improve evaluation accuracy. In this way, the usage efficiency of 2D ultrasonic scanning devices may be effectively enhanced, thereby reducing the deployment cost of ultrasonic scanning equipment. Moreover, the automatically evaluated cardiac status may serve as a reference for medical professionals and non-professionals.
It will be apparent to those skilled in the art that various modifications and variations can be made to the disclosed embodiments without departing from the scope or spirit of the disclosure. In view of the foregoing, it is intended that the disclosure covers modifications and variations provided they fall within the scope of the following claims and their equivalents.
Number | Date | Country | Kind
---|---|---|---
107143784 | Dec. 5, 2018 | TW | national