The present disclosure relates generally to methods and systems for identifying a heart condition in a non-human subject.
Challenges lab technicians face when reviewing diagnostic test results and imaging of non-human subjects include the need for manual review, human error or bias in interpretation, and potentially lengthy review of individual cases. It is estimated that some labs may spend hundreds of hours (or more) a month reading and interpreting case files to develop a diagnosis. In addition, interpretation of case files can also require telemedicine services or input by further experts, presenting a strain on resources and delaying a full and correct diagnosis of the subject.
In an example, a computer-implemented method for identifying a heart condition in a non-human subject is described comprising receiving a medical image of the non-human subject, determining, by a processor executing a first machine-learning logic and based on the medical image, a dimensional feature of a heart of the non-human subject, wherein the first machine-learning logic is trained using medical image training data labeled with anatomical landmarks, receiving medical information associated with the non-human subject on a graphical user interface, the medical information including one or more of a species, an age, a weight, an output of a brain natriuretic peptide (BNP) test, and a murmur observation, determining, by the processor executing a second machine-learning logic and based on the medical information and the dimensional feature of the heart, a likelihood of a heart disease, wherein the second machine-learning logic is trained using heart disease training data, combinations of which are labeled with different stages of heart disease, and displaying the likelihood of the heart disease on the graphical user interface.
In another example, a server is described comprising one or more processors, and non-transitory computer readable medium having stored therein instructions that when executed by the one or more processors, cause the server to perform functions. The functions comprise receiving a medical image of a non-human subject, determining, by the one or more processors executing a first machine-learning logic and based on the medical image, a dimensional feature of a heart of the non-human subject, wherein the first machine-learning logic is trained using medical image training data labeled with anatomical landmarks, receiving medical information associated with the non-human subject on a graphical user interface, the medical information including one or more of a species, an age, a weight, an output of a brain natriuretic peptide (BNP) test, and a murmur observation, determining, by the one or more processors executing a second machine-learning logic and based on the medical information and the dimensional feature of the heart, a likelihood of a heart disease, wherein the second machine-learning logic is trained using heart disease training data, combinations of which are labeled with different stages of heart disease, and displaying the likelihood of the heart disease on the graphical user interface.
In another example, a non-transitory computer readable medium is described having stored thereon instructions, that when executed by one or more processors of a computing device, cause the computing device to perform functions. The functions comprise receiving a medical image of a non-human subject, determining, by the one or more processors executing a first machine-learning logic and based on the medical image, a dimensional feature of a heart of the non-human subject, wherein the first machine-learning logic is trained using medical image training data labeled with anatomical landmarks, receiving medical information associated with the non-human subject on a graphical user interface, the medical information including one or more of a species, an age, a weight, an output of a brain natriuretic peptide (BNP) test, and a murmur observation, determining, by the one or more processors executing a second machine-learning logic and based on the medical information and the dimensional feature of the heart, a likelihood of a heart disease, wherein the second machine-learning logic is trained using heart disease training data, combinations of which are labeled with different stages of heart disease, and displaying the likelihood of the heart disease on the graphical user interface.
The features, functions, and advantages that have been discussed can be achieved independently in various examples or may be combined in yet other examples. Further details of the examples can be seen with reference to the following description and drawings.
Examples and descriptions of the present disclosure will be readily understood by reference to the following detailed description of illustrative examples when read in conjunction with the accompanying drawings, wherein:
Disclosed examples will now be described more fully hereinafter with reference to the accompanying drawings. Several different examples are described herein, but embodiments should not be construed as limited to the examples set forth herein. Rather, these examples are described so that this disclosure will be thorough and complete and will fully convey the scope of the disclosure to those skilled in the art.
Existing processes for review of case files include manual review requiring hundreds of hours of work, which strains resources and can produce inaccurate or incomplete results. Within examples herein, systems and methods provide executable machine-learning logic for automated image analysis (e.g., measurement of anatomical features in x-rays, such as heart measurements), display of measurements and other patient details, predictive modeling providing indications of a heart condition (e.g., suggesting a mitral valve disease diagnosis and staging), and tailored report creation.
Example systems and methods streamline reading of heart disease cases to increase efficiency and effectiveness of identifying heart conditions. The automated measurement tools (e.g., for both radiographs and echocardiograms) generate accurate and near-instant measurements of features of the heart, and these measurements are displayed in a viewer along with other relevant patient details drawn from demographics, history, labwork, etc. The measurements and patient details are used as inputs to executable machine-learning logic to generate predictions about whether the patient is likely to have mitral valve disease, and if so, a staging of the disease.
Thus, an example computer-implemented method for identifying a heart condition in a non-human subject includes receiving a medical image of the non-human subject, and performing the automated measurements by a processor executing a first machine-learning logic to determine a dimensional feature of a heart of the non-human subject. Such dimensional features of the heart are displayed on a graphical user interface, and then the predictive modeling is run by the processor executing a second machine-learning logic using medical information of the patient and the dimensional feature of the heart to determine a likelihood of a heart disease.
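For illustration only, the two-stage flow described above might be organized as in the following sketch; the class and function names are hypothetical stand-ins and do not represent the disclosed first or second machine-learning logic.

```python
# Hypothetical, high-level sketch of the two-stage flow: image -> dimensional
# features, then features + medical information -> disease likelihood and stage.
class FirstMLLogicStub:
    def dimensional_features(self, image):
        # A real implementation would locate anatomical landmarks in the image
        # and perform measurement calculations between them.
        return {"VHS": 11.6, "VLAS": 2.5}

class SecondMLLogicStub:
    def likelihood(self, features):
        # A real implementation would be trained on heart disease training data
        # labeled with different stages of heart disease.
        return {"probability": 0.82, "stage": "B2"}

def identify_heart_condition(image, medical_info, first_logic, second_logic):
    measurements = first_logic.dimensional_features(image)
    combined = {**measurements, **medical_info}
    return measurements, second_logic.likelihood(combined)

# Example usage with placeholder inputs:
measurements, prediction = identify_heart_condition(
    image=None,
    medical_info={"species": "canine", "age_years": 8, "weight_kg": 7.0},
    first_logic=FirstMLLogicStub(),
    second_logic=SecondMLLogicStub(),
)
```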
Implementations of this disclosure thus provide technological improvements that are particular to computer technology, for example, those concerning analysis of imaging of a non-human subject for automated generation of measurements of anatomical features. Computer-specific technological problems, such as executing machine-learning logic in beneficial ways, can be wholly or partially solved by implementations of this disclosure. For example, implementations of this disclosure allow for x-rays and ultrasound images of a subject to be matched to labeled images using a machine-learning logic to identify features in the images so that measurements of the features can be made.
The systems and methods of the present disclosure further address problems particular to computer devices and operation of diagnostic instruments, for example, those concerning analysis of imaging and medical information about a patient. Utilizing machine-learning algorithms trained on manually labeled data enables a more immediate and normalized analysis of the data. Thus, analysis of the image and medical information can occur in a manner that is efficient and takes into account all patients' needs, and a predictive modeling approach provides summaries from which a likelihood of heart disease can be immediately determined. Implementations of this disclosure can thus introduce new and efficient improvements in the ways in which data output from diagnostic instruments is analyzed to determine a likelihood of a heart disease in a subject.
Referring now to the figures, an example system 100 for identifying a heart condition in a non-human subject includes a server 102 in communication, over a network 104, with a remote computing device 106 and a diagnostic testing instrument 110.
In addition, while the example depicted in the figures shows one arrangement of the components of the system 100, other arrangements are possible as well.
In the system 100, the network 104 (e.g., Internet) provides access to the server 102 for all network-connected components. In some examples, more components of the system 100 may be in communication with the network 104 to access the server 102. Communication with the server 102 and/or with the network 104 may be wired or wireless communication (e.g., some components may be in wired Ethernet communication and others may use Wi-Fi communication). In still further examples, the network 104 provides access for the server 102 to communicate with the remote computing device 106 directly as well.
The system 100 enables a method for identifying a heart condition in a non-human subject, for example. The server 102 receives a medical image of the non-human subject, such as an x-ray taken by the diagnostic testing instrument 110 and then sent by the remote computing device 106 over the network 104. The server 102 then executes machine-learning logic to determine a dimensional feature of a heart of the non-human subject from the image, and executes machine-learning logic on medical information of the subject and the dimensional feature of the heart to determine a likelihood of a heart disease. In some examples, all of these functions are performed by the remote computing device 106 without the aid of the server 102.
The remote computing device 106, in embodiments, includes a graphical user interface (GUI) 114 for display, which is operable to receive input data to send to the server 102 (e.g., data associated with the subject such as breed, age, etc.). The GUI 114 is thus operable to receive inputs to the remote computing device 106, and to provide an updated display that displays the dimensional feature of the heart on an interface of the GUI 114, and displays the likelihood of the heart disease on the interface of the GUI 114.
The system 100, in embodiments, includes a medical information database 116 with which the server 102 is in communication for retrieving corresponding medical information of the subject. Example medical information associated with the non-human subject that is stored in the medical information database 116 includes one or more of a species, an age, a weight, an output of a brain natriuretic peptide (BNP) test, and a murmur observation. Any information captured during a visit to the clinic by the subject can be input into the medical information database 116.
In some examples, the remote computing device 106 has access to the medical information database 116, such as via the network 104.
The server 102 includes one or more processor(s) 130, and non-transitory computer readable medium 132 having stored therein instructions 134 that when executed by the one or more processor(s) 130, cause the server 102 to perform functions for processing diagnostic test result data, as well as management and control of diagnostic testing instruments 110.
To perform these functions, the server 102 also includes a communication interface 136, an output interface 138, and components of the server 102 are connected to a communication bus 140. The server 102 may also include hardware to enable communication within the server 102 and between the server 102 and other devices (not shown). The hardware may include transmitters, receivers, and antennas, for example. The server 102 may further include a display (not shown).
The communication interface 136 may be a wireless interface and/or one or more wireline interfaces that allow for both short-range communication and long-range communication to one or more networks or to one or more remote devices. Such wireless interfaces may provide for communication under one or more wireless communication protocols, such as Bluetooth, WiFi (e.g., an Institute of Electrical and Electronics Engineers (IEEE) 802.11 protocol), Long-Term Evolution (LTE), cellular communications, near-field communication (NFC), and/or other wireless communication protocols. Such wireline interfaces may include an Ethernet interface, a Universal Serial Bus (USB) interface, or similar interface to communicate via a wire, a twisted pair of wires, a coaxial cable, an optical link, a fiber-optic link, or other physical connection to a wireline network. Thus, the communication interface 136 may be configured to receive input data from one or more devices, and may also be configured to send output data to other devices.
The non-transitory computer readable medium 132 includes or takes the form of memory, such as one or more computer-readable storage media that can be read or accessed by the one or more processor(s) 130. The non-transitory computer readable medium 132 can include volatile and/or non-volatile storage components, such as optical, magnetic, organic or other memory or disc storage, which can be integrated in whole or in part with the one or more processor(s) 130. In some examples, the non-transitory computer readable medium 132 is implemented using a single physical device (e.g., one optical, magnetic, organic or other memory or disc storage unit), while in other examples, the non-transitory computer readable medium 132 is implemented using two or more physical devices. The non-transitory computer readable medium 132 thus is a computer readable storage, and the instructions 134 are stored thereon. The instructions 134 include computer executable code.
The one or more processor(s) 130 may be general-purpose processors or special purpose processors (e.g., digital signal processors, application specific integrated circuits, etc.). The one or more processor(s) 130 receive inputs from the communication interface 136 (e.g., x-ray images), and process the inputs to generate outputs that are stored in the non-transitory computer readable medium 132. The one or more processor(s) 130 are configured to execute the instructions 134 (e.g., computer-readable program instructions) that are stored in the non-transitory computer readable medium 132 and are executable to provide the functionality of the server 102 described herein.
The output interface 138 outputs information for transmission, reporting, or storage, and thus, the output interface 138 may be similar to the communication interface 136 and can be a wireless interface (e.g., transmitter) or a wired interface as well.
Within examples, the instructions 134 include specific software for performing the functions including a GUI 142, a first machine-learning logic 144, and a second machine-learning logic 146.
With respect to the GUI 142, the server 102 includes a display to display information.
The first machine-learning logic 144 and the second machine-learning logic 146 are each selectable for execution based on a type of information that is received at the server 102. In one example, initial data received at the server 102 from the remote computing device 106 includes an identifier, for example, of a type of diagnostic testing instrument that output the data. The server 102 selects one of the first machine-learning logic 144 or the second machine-learning logic 146 for execution based on the type of diagnostic data and/or the type of diagnostic testing instrument that output the data.
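A minimal sketch of this selection step follows; the identifier values and the function name are assumptions made for illustration.

```python
# Hypothetical sketch of selecting which machine-learning logic to execute based
# on an identifier accompanying the received data.
def select_logic(payload, first_ml_logic, second_ml_logic):
    instrument_type = payload.get("instrument_type")
    if instrument_type in ("radiography", "ultrasound"):
        return first_ml_logic   # imaging data -> automated measurements
    return second_ml_logic      # measurements plus medical information -> prediction
```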
Although two different machine-learning logic 144 and 146 are shown, the server 102 and the instructions 134 may include more or fewer machine-learning logic for different testing instrumentation.
Each of the first machine-learning logic 144 and the second machine-learning logic 146 is trained using specific training data. As shown in the figures, the first machine-learning logic 144 is trained using medical image training data 150 labeled with anatomical landmarks, and the second machine-learning logic 146 is trained using heart disease training data 152, combinations of which are labeled with different stages of heart disease.
In one example operation, the server 102 receives from the remote computing device 106 a medical image of the non-human subject and determines, by the processor 130 executing the first machine-learning logic 144 and based on the medical image, a dimensional feature of a heart of the non-human subject. The server 102 then receives medical information associated with the non-human subject and determines, by the processor 130 executing the second machine-learning logic 146 and based on the medical information and the dimensional feature of the heart, a likelihood of a heart disease, which is provided to the remote computing device 106 for display on the GUI 114.
In another example operation, the remote computing device 106 performs all functions and receives from the diagnostic testing instrument 110 a medical image of the non-human subject and determines, by a processor of the remote computing device 106 executing the first machine-learning logic 144 and based on the medical image, a dimensional feature of a heart of the non-human subject. The remote computing device 106 then displays the dimensional feature of the heart on the GUI 114 and/or 142. Following, the remote computing device 106 receives medical information associated with the non-human subject on the GUI 114 and/or 142, such as a species, an age, a weight, an output of a brain natriuretic peptide (BNP) test, and a murmur observation, and determines, by the processor executing the second machine-learning logic 146 and based on the medical information and the dimensional feature of the heart, a likelihood of a heart disease. The remote computing device 106 then displays the likelihood of the heart disease on the GUI 114 and/or 142.
Execution of the first machine-learning logic 144 and the second machine-learning logic 146 to perform analysis of the medical imaging and information removes any human pathologist bias and generates normalized results for all inputs.
Referring to the first machine-learning logic 144 and the second machine-learning logic 146, many types of machine-learning algorithms and neural networks can be employed to carry out the functionality described herein. In one example, the first machine-learning logic 144 and the second machine-learning logic 146 use statistical models to generate outputs without using explicit instructions, instead relying on patterns and inferences obtained by processing associated training data.
The first machine-learning logic 144 and the second machine-learning logic 146 can thus operate according to machine-learning tasks as classified into several categories. In supervised learning, the first machine-learning logic 144 and the second machine-learning logic 146 build a mathematical model from a set of data that contains both the inputs and the desired outputs. The set of data is sample data known as the “training data”, which enables the logic to make predictions or decisions without being explicitly programmed to perform the task. For example, the first machine-learning logic 144 and the second machine-learning logic 146 utilize the medical image training data 150 and the heart disease training data 152 within comparisons to identify matches of features in received images to the labeled anatomical landmarks within a threshold. When such a match is found, the labeled training data is referenced as a label to be applied to the received image data.
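One way such a threshold-based match against labeled anatomical landmarks could be computed is sketched below; the descriptor representation, the distance metric, and the threshold value are illustrative assumptions rather than the disclosed training procedure.

```python
# Hypothetical sketch of matching detected image features to labeled anatomical
# landmarks within a distance threshold.
import numpy as np

def label_features(detected, labeled_landmarks, threshold=0.05):
    """Assign a landmark label to each detected feature whose descriptor lies
    within `threshold` of a labeled training example; otherwise leave it unlabeled.

    `detected` and the values of `labeled_landmarks` are descriptor vectors
    (e.g., normalized coordinates or learned embeddings).
    """
    labels = []
    for feature in detected:
        best_label, best_dist = None, np.inf
        for label, example in labeled_landmarks.items():
            dist = np.linalg.norm(np.asarray(feature) - np.asarray(example))
            if dist < best_dist:
                best_label, best_dist = label, dist
        labels.append(best_label if best_dist <= threshold else None)
    return labels
```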
In another category referred to as semi-supervised learning, the first machine-learning logic 144 and the second machine-learning logic 146 develop mathematical models from incomplete training data, where a portion of the sample input does not have labels. A classification algorithm can then be used when the outputs are restricted to a limited set of values.
In another category referred to as unsupervised learning, the first machine-learning logic 144 and the second machine-learning logic 146 build a mathematical model from a set of data that contains only inputs and no desired output labels. Unsupervised learning algorithms are used to find structure in related training data, such as grouping or clustering of data points. Unsupervised learning can discover patterns in data, and can group the inputs into categories.
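As a generic illustration of this category (not specific to the disclosed logic), unlabeled dimensional measurements could be grouped as follows, assuming a scikit-learn-style clustering routine.

```python
# Generic illustration of unsupervised grouping: cluster dimensional measurements
# (e.g., [VHS, VLAS] pairs) without any stage labels.
from sklearn.cluster import KMeans

measurements = [[10.2, 2.0], [10.4, 2.1], [11.8, 2.9], [12.0, 3.0]]
cluster_ids = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(measurements)
```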
Alternative machine-learning algorithms may be used to learn and classify types of images and medical information to consider for generating the prediction of a heart disease, such as deep learning through neural networks or generative models. Deep machine-learning may use neural networks to analyze prior test results through a collection of interconnected processing nodes. The connections between the nodes may be dynamically weighted. Neural networks learn relationships through repeated exposure to data and adjustment of internal weights. Neural networks may capture nonlinearity and interactions among independent variables without pre-specification. Whereas traditional regression analysis requires that nonlinearities and interactions be detected and specified manually, neural networks perform these tasks automatically.
Still other machine-learning algorithms or functions can be implemented to identify anatomical features in images and determine a prediction of a heart disease, such as any number of classifiers that receive input parameters and output a classification (e.g., attributes of the image). A support vector machine, Bayesian network, probabilistic boosting tree, neural network, sparse auto-encoding classifier, or other known or later developed machine-learning algorithms, or any suitable combination thereof, may be used. Any semi-supervised, supervised, or unsupervised learning may be used. Hierarchical, cascade, or other approaches may also be used.
The first machine-learning logic 144 and the second machine-learning logic 146 may thus be considered an application of rules in combination with learning from prior labeled data to identify appropriate outputs. Analyzing and relying on prior labeled data allows the first machine-learning logic 144 and the second machine-learning logic 146 to apply patterns of images and medical information associated with a likelihood of heart disease, for example.
Thus, the first machine-learning logic 144 and the second machine-learning logic 146 take the form of one or a combination of any of the machine-learning algorithms described herein, for example.
Many different types of measurements are useful as factors to consider for identifying a heart condition. Example dimensional features of the heart to determine based on measurement calculations include a left atrial to aortic ratio (LA/Ao) measurement, a left ventricular internal diameter at end-diastole (LVIDdN) measurement, and a ventricular left atrial size (VLAS) measurement. Other dimensional features of the heart to determine based on measurement calculations can include a vertebral heart scale (VHS) and an atrial measurement, for example.
The determined dimensional features are input into the heart disease predictive model 168, and functions of the heart disease predictive model 168 are performed by the processor 130 executing the second machine-learning logic 146. The heart disease predictive model 168 is trained on dimensional data (e.g., LA/Ao, LVIDdN, VLAS, etc.) to predict different stages of heart disease per patient using additional patient data (e.g., age, body size, breed, etc.).
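A rough, hypothetical illustration of training such a staging model is shown below; the feature layout, the scikit-learn estimator, and the example values are assumptions and do not represent the disclosed heart disease predictive model 168.

```python
# Rough, hypothetical illustration of training a staging classifier on dimensional
# measurements plus patient data labeled with disease stages (not model 168 itself).
from sklearn.ensemble import RandomForestClassifier

# Each row: [LA/Ao, LVIDdN, VLAS, VHS, age_years, weight_kg, murmur_grade]
X_train = [
    [1.3, 1.5, 2.0, 10.2, 5, 8.0, 0],
    [1.8, 1.9, 2.9, 11.8, 9, 6.5, 4],
]
y_train = ["B1", "B2"]  # stage labels from the heart disease training data

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# Probability per stage for a new patient's measurements and history:
probabilities = model.predict_proba([[1.7, 1.8, 2.5, 11.6, 8, 7.0, 3]])
```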
An output of the heart disease predictive model 168 is a prediction of a likelihood of a heart disease in the patient, such as a likelihood of a mitral valve disease and a stage of the heart disease. The output is provided in a report 170 that includes a summary of pertinent data analyzed by the heart disease predictive model 168 (e.g., dimensional measurements, patient data, etc.) along with a VHS. The report 170 is provided in the GUI 142 for display.
To make the measurements shown in the figures, the processor 130 executes the first machine-learning logic 144 to identify features in the medical image matching the labeled anatomical landmarks, and measurement calculations are performed between selected features in the medical image.
Referring back to the heart disease predictive model 168, the determined dimensional features and the received medical information together form the inputs for the predictive analysis.
Data input to the heart disease predictive model 168 can thus include species (e.g., canine), age (e.g., young or old, such as >4 years), body size (e.g., small breed <20 kg), murmur (e.g., none, soft, loud), BNP (e.g., less than or greater than 900), VHS (e.g., less than or greater than 11.1/11.5), VLAS (e.g., less than 2.3; 2.3-2.8; or greater than 2.8), LA/Ao (e.g., less than or greater than 1.6), LVIDdN (e.g., less than or greater than 1.7), and echo visualization of deformed valves/documentation of regurgitation with Doppler.
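A hedged sketch of encoding these inputs using the thresholds recited above follows; the field names and the exact encoding scheme are illustrative assumptions.

```python
# Hedged sketch: discretizing the listed inputs with the thresholds recited above
# (field names are illustrative; the actual encoding used by model 168 may differ).
def encode_inputs(patient):
    return {
        "old": patient["age_years"] > 4,
        "small_breed": patient["weight_kg"] < 20,
        "murmur": patient["murmur"],              # "none", "soft", or "loud"
        "bnp_high": patient["bnp"] > 900,
        "vhs_high": patient["vhs"] > 11.1,        # 11.1/11.5 threshold recited above
        "vlas_band": ("<2.3" if patient["vlas"] < 2.3
                      else "2.3-2.8" if patient["vlas"] <= 2.8
                      else ">2.8"),
        "la_ao_high": patient["la_ao"] > 1.6,
        "lviddn_high": patient["lviddn"] > 1.7,
    }
```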
An output of the heart disease predictive model 168 is a summary report with a predictive analysis. Thus, a structured report is output for a mitral valve disease dog using the automated VHS, with the VHS added to the report so that all parameters are shown, and the same can be performed with the VLAS, used in combination with age, weight, etc., to generate a complete report for a predictive diagnosis of a stage B1 or B2 mitral valve disease dog. Within the report 170, the measurements are pre-populated, and access to the images used to calculate the measurements is enabled through drop-down menus where the measurements are shown with graphical overlays.
An example of a predictive analysis includes a patient that is a small breed dog with a loud heart murmur, where the VHS and VLAS are obtained from radiographs and/or left atrial and left ventricular size from echo images. From the patient data and these measurements, the processor 130 executes the second machine-learning logic 146 to generate a structured report including a prediction of a likelihood of heart disease and a disease stage, where information in the report is automatically populated.
Within examples, different heart diseases are evaluated within the heart disease predictive model 168. Examples of heart diseases considered include mitral valve heart disease and dilated cardiomyopathy (DCM). For instance, for DCM, the heart disease prediction considers body weight, murmur, VHS, VLAS, LA/Ao, and the left ventricular internal diameter normalized (LVIDdN) dimension, and DCM is diagnosed when body weight is greater than 20 kg, a murmur is greater than grade 3, and fractional shortening (part of the LV echo measurement tool) is less than 20%.
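The DCM criteria stated above translate directly into a simple check, sketched below with illustrative variable names.

```python
# Direct encoding of the DCM criteria stated above (weight, murmur grade,
# fractional shortening); variable names are illustrative.
def meets_dcm_criteria(weight_kg, murmur_grade, fractional_shortening_pct):
    return (weight_kg > 20
            and murmur_grade > 3
            and fractional_shortening_pct < 20)
```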
For mitral valve heart disease, in addition to making a prediction of a likelihood of the subject having mitral valve disease, the second machine-learning logic 146 is executable to estimate a stage of the disease. Parameters considered in the heart disease training data 152 for determination of different stages include: Stage A (subject at high risk for developing heart disease); Stage B1 (subject with valvular regurgitation but no cardiac dilation); Stage B2 (subject with valvular regurgitation and cardiac dilation); Stage C (subject with valvular regurgitation, cardiac dilation and congestive heart failure); and Stage D (stage C subject with recurrent/refractory heart failure). Thus, when displaying the likelihood of the heart disease on the GUI 142, a prediction of the stage of the heart disease is also displayed.
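For display purposes, the stage parameters listed above could be mapped to report text as in the following sketch; the description strings are taken from the stages recited above, while the dictionary itself is an illustrative assumption.

```python
# Stage descriptions drawn from the parameters listed above, e.g., for populating
# the displayed prediction; the mapping structure itself is illustrative.
MITRAL_VALVE_STAGES = {
    "A": "Subject at high risk for developing heart disease",
    "B1": "Valvular regurgitation but no cardiac dilation",
    "B2": "Valvular regurgitation and cardiac dilation",
    "C": "Valvular regurgitation, cardiac dilation and congestive heart failure",
    "D": "Stage C subject with recurrent/refractory heart failure",
}
```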
It should be understood that for this and other processes and methods disclosed herein, flowcharts show functionality and operation of one possible implementation of present examples. In this regard, each block or portions of each block may represent a module, a segment, or a portion of program code, which includes one or more instructions executable by a processor for implementing specific logical functions or steps in the process. The program code may be stored on any type of computer readable medium or data storage, for example, such as a storage device including a disk or hard drive. Further, the program code can be encoded on computer-readable storage media in a machine-readable format, or on other non-transitory media or articles of manufacture. The computer readable medium may include a non-transitory computer readable medium or memory, for example, such as computer-readable media that store data for short periods of time like register memory, processor cache, and Random Access Memory (RAM). The computer readable medium may also include non-transitory media, such as secondary or persistent long-term storage, like read only memory (ROM), optical or magnetic disks, or compact-disc read only memory (CD-ROM), for example. The computer readable media may also be any other volatile or non-volatile storage systems. The computer readable medium may be considered a tangible computer readable storage medium, for example.
In addition, each block or portions of each block in the flowcharts may represent circuitry that is wired to perform the specific logical functions in the process.
At block 202, the method 200 includes receiving a medical image of the non-human subject. In one example, receiving the medical image comprises receiving one or more of an echocardiogram, an x-ray, a radiograph, and an ultrasound image.
At block 204, the method 200 includes determining, by a processor executing a first machine-learning logic and based on the medical image, a dimensional feature of a heart of the non-human subject and the first machine-learning logic is trained using medical image training data labeled with anatomical landmarks. In one example, determining the dimensional feature of the heart comprises determining features in the medical image matching to the anatomical landmarks in the labeled training data, and performing a measurement calculation between selected features in the medical image. Within examples, the dimensional feature of the heart includes one or more of a vertebral heart score, an atrial measurement, a ventricular left atrial size (VLAS) measurement, a vertebral heart scale (VHS), a left atrial to aortic ratio (LA/Ao) measurement, and a left ventricular internal diameter at end-diastole (LVIDdN) measurement. Within examples, the vertebral heart score, vertebral heart size, and the VHS are considered the same metric based on a viewpoint of the measurements being made, and reference to these terms herein refers to the same metric.
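As one hedged example of such a measurement calculation, a vertebral heart scale could be computed from matched landmark points as sketched below; the landmark names and the use of the conventional long-axis-plus-short-axis formula expressed in vertebral-body units are assumptions about the implementation.

```python
# Hedged sketch of one measurement calculation (VHS) from matched landmark points.
import math

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def vertebral_heart_scale(landmarks, vertebral_body_length):
    """Express the cardiac long and short axes in vertebral-body units and sum them."""
    long_axis = dist(landmarks["carina"], landmarks["cardiac_apex"])
    short_axis = dist(landmarks["cranial_heart_border"], landmarks["caudal_heart_border"])
    return (long_axis + short_axis) / vertebral_body_length

# Example usage with illustrative pixel coordinates:
points = {"carina": (210, 118), "cardiac_apex": (288, 256),
          "cranial_heart_border": (182, 205), "caudal_heart_border": (305, 188)}
vhs = vertebral_heart_scale(points, vertebral_body_length=15.0)
```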
At block 206, the method 200 optionally includes displaying the dimensional feature of the heart on a graphical user interface. In one example, displaying the dimensional feature of the heart on the graphical user interface comprises displaying the dimensional feature of the heart as a digital graphic overlaying the medical image on the graphical user interface.
At block 208, the method 200 includes receiving medical information associated with the non-human subject on the graphical user interface, the medical information including one or more of a species, an age, a weight, an output of a brain natriuretic peptide (BNP) test, and a murmur observation.
At block 210, the method 200 includes determining, by the processor executing a second machine-learning logic and based on the medical information and the dimensional feature of the heart, a likelihood of a heart disease, wherein the second machine-learning logic is trained using heart disease training data, combinations of which are labeled with different stages of heart disease. In one example, determining the likelihood of the heart disease comprises determining a likelihood of a mitral valve disease, and determining a stage of the heart disease.
At block 212, the method 200 includes displaying the likelihood of the heart disease on the graphical user interface. In one example, displaying the likelihood of the heart disease on the graphical user interface comprises displaying a prediction of a stage of the heart disease, displaying a summary of the dimensional feature of the heart, and displaying interactive menus selectable to modify the graphical user interface to display a graph for trend analysis.
Within example systems and methods described herein, advantages and improvements over existing methods include: utilization of normalized measurement processes (in contrast to manual use of a basic ruler tool for measuring dimensions of the heart, for which many variations are possible due to different ruler tools and user bias); integration of electronic information from several different sources, such as patient demographics, case history, bloodwork results, and the image viewer; and a highly detailed structured report including individual transcription of cardiac measurements.
The description of the different advantageous arrangements has been presented for purposes of illustration and description, and is not intended to be exhaustive or limited to the examples in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art. Further, different advantageous examples may describe different advantages as compared to other advantageous examples. The example or examples selected are chosen and described in order to explain the principles of the examples, the practical application, and to enable others of ordinary skill in the art to understand the disclosure for various examples with various modifications as are suited to the particular use contemplated.
Different examples of the system(s), device(s), and method(s) disclosed herein include a variety of components, features, and functionalities. It should be understood that the various examples of the system(s), device(s), and method(s) disclosed herein may include any of the components, features, and functionalities of any of the other examples of the system(s), device(s), and method(s) disclosed herein in any combination or any sub-combination, and all of such possibilities are intended to be within the scope of the disclosure.
Thus, examples of the present disclosure relate to enumerated clauses (ECs) listed below in any combination or any sub-combination.
EC 1 is a computer-implemented method for identifying a heart condition in a non-human subject, the method comprising: receiving a medical image of the non-human subject; determining, by a processor executing a first machine-learning logic and based on the medical image, a dimensional feature of a heart of the non-human subject, wherein the first machine-learning logic is trained using medical image training data labeled with anatomical landmarks; receiving medical information associated with the non-human subject on a graphical user interface, the medical information including one or more of a species, an age, a weight, an output of a brain natriuretic peptide (BNP) test, and a murmur observation; determining, by the processor executing a second machine-learning logic and based on the medical information and the dimensional feature of the heart, a likelihood of a heart disease, wherein the second machine-learning logic is trained using heart disease training data, combinations of which are labeled with different stages of heart disease; and displaying the likelihood of the heart disease on the graphical user interface.
EC 2 is the method of EC 1, wherein: said receiving of the medical image comprises receiving one or more of an echocardiogram, an x-ray, a radiograph, and an ultrasound image.
EC 3 is the method of any of ECs 1-2, wherein said determining of the dimensional feature of the heart comprises: determining features in the medical image matching to the anatomical landmarks in the labeled training data; and performing a measurement calculation between selected features in the medical image.
EC 4 is the method of any of ECs 1-3, wherein said determining of the dimensional feature of the heart comprises determining a vertebral heart score.
EC 5 is the method of any of ECs 1-4, wherein said determining of the dimensional feature of the heart comprises determining an atrial measurement.
EC 6 is the method of any of ECs 1-5, wherein said determining of the dimensional feature of the heart comprises determining a ventricular left atrial size (VLAS) measurement.
EC 7 is the method of any of ECs 1-6, wherein said determining of the dimensional feature of the heart comprises determining a vertebral heart scale (VHS).
EC 8 is the method of any of ECs 1-7, wherein said receiving of the medical image comprises receiving a radiograph and an echocardiogram, and wherein said determining the dimensional feature of the heart comprises: determining a ventricular left atrial size (VLAS) measurement based on the radiograph; determining a left atrial to aortic ratio (LA/Ao) measurement based on the echocardiogram; and determining a left ventricular internal diameter at end-diastole (LVIDdN) measurement based on the echocardiogram.
EC 9 is the method of any of ECs 1-8, further comprising displaying the dimensional feature of the heart on a graphical user interface as a digital graphic overlaying the medical image on the graphical user interface.
EC 10 is the method of any of ECs 1-9, wherein said determining of the likelihood of the heart disease comprises determining a likelihood of a mitral valve disease.
EC 11 is the method of any of ECs 1-10, wherein said determining of the likelihood of the heart disease comprises determining a stage of the heart disease.
EC 12 is the method of any of ECs 1-11, wherein said displaying of the likelihood of the heart disease on the graphical user interface comprises: displaying a prediction of a stage of the heart disease; displaying a summary of the dimensional feature of the heart; and displaying interactive menus selectable to modify the graphical user interface to display a graph for trend analysis.
EC 13 is a server comprising: one or more processors; and non-transitory computer readable medium having stored therein instructions that when executed by the one or more processors, cause the server to perform functions comprising: receiving a medical image of a non-human subject; determining, by the one or more processors executing a first machine-learning logic and based on the medical image, a dimensional feature of a heart of the non-human subject, wherein the first machine-learning logic is trained using medical image training data labeled with anatomical landmarks; receiving medical information associated with the non-human subject on a graphical user interface, the medical information including one or more of a species, an age, a weight, an output of a brain natriuretic peptide (BNP) test, and a murmur observation; determining, by the one or more processors executing a second machine-learning logic and based on the medical information and the dimensional feature of the heart, a likelihood of a heart disease, wherein the second machine-learning logic is trained using heart disease training data, combinations of which are labeled with different stages of heart disease; and displaying the likelihood of the heart disease on the graphical user interface.
EC 14 is the server of EC 13, wherein said receiving of the medical image comprises receiving one or more of an echocardiogram, an x-ray, a radiograph, and an ultrasound image.
EC 15 is the server of any of ECs 13-14, wherein said determining of the dimensional feature of the heart comprises: determining features in the medical image matching to the anatomical landmarks in the labeled training data; and performing a measurement calculation between selected features in the medical image.
EC 16 is the server of any of ECs 13-15, wherein said receiving of the medical image comprises receiving a radiograph and an echocardiogram, and wherein said determining the dimensional feature of the heart comprises: determining a ventricular left atrial size (VLAS) measurement based on the radiograph; determining a left atrial to aortic ratio (LA/Ao) measurement based on the echocardiogram; and determining a left ventricular internal diameter at end-diastole (LVIDdN) measurement based on the echocardiogram.
EC 17 is a non-transitory computer readable medium having stored thereon instructions, that when executed by one or more processors of a computing device, cause the computing device to perform functions comprising: receiving a medical image of a non-human subject; determining, by the one or more processors executing a first machine-learning logic and based on the medical image, a dimensional feature of a heart of the non-human subject, wherein the first machine-learning logic is trained using medical image training data labeled with anatomical landmarks; receiving medical information associated with the non-human subject on a graphical user interface, the medical information including one or more of a species, an age, a weight, an output of a brain natriuretic peptide (BNP) test, and a murmur observation; determining, by the one or more processors executing a second machine-learning logic and based on the medical information and the dimensional feature of the heart, a likelihood of a heart disease, wherein the second machine-learning logic is trained using heart disease training data, combinations of which are labeled with different stages of heart disease; and displaying the likelihood of the heart disease on the graphical user interface.
EC 18 is the non-transitory computer readable medium of EC 17, wherein said determining of the dimensional feature of the heart comprises: determining features in the medical image matching to the anatomical landmarks in the labeled training data; and performing a measurement calculation between selected features in the medical image.
EC 19 is the non-transitory computer readable medium of any of ECs 17-18, wherein said receiving of the medical image comprises receiving a radiograph and an echocardiogram, and wherein said determining the dimensional feature of the heart comprises: determining a ventricular left atrial size (VLAS) measurement based on the radiograph; determining a left atrial to aortic ratio (LA/Ao) measurement based on the echocardiogram; and determining a left ventricular internal diameter at end-diastole (LVIDdN) measurement based on the echocardiogram.
EC 20 is the non-transitory computer readable medium of any of ECs 17-19, wherein functions further comprise displaying the dimensional feature of the heart on the graphical user interface as a digital graphic overlaying the medical image on the graphical user interface.
By the terms “substantially” and “about” used herein, it is meant that the recited characteristic, parameter, or value need not be achieved exactly, but that deviations or variations, including for example, tolerances, measurement error, measurement accuracy limitations and other factors known to those skilled in the art, may occur in amounts that do not preclude the effect the characteristic was intended to provide. The terms “substantially” and “about” represent the inherent degree of uncertainty that may be attributed to any quantitative comparison, value, measurement, or other representation. The terms “substantially” and “about” are also utilized herein to represent the degree by which a quantitative representation may vary from a stated reference without resulting in a change in the basic function of the subject matter at issue.
It is noted that one or more of the following claims utilize the term “wherein” as a transitional phrase. For the purposes of defining the present invention, it is noted that this term is introduced in the claims as an open-ended transitional phrase that is used to introduce a recitation of a series of characteristics of the structure and should be interpreted in like manner as the more commonly used open-ended preamble term “comprising.”
The present disclosure claims priority to U.S. provisional application No. 63/489,220, filed on Mar. 9, 2023, the entire contents of which are herein incorporated by reference.