Currently, the assessment of esophageal motility disorders centers on using a transnasal catheter to perform pressure measurements while the patient is awake. The functional lumen imaging probe (“FLIP”) was developed to circumvent the problem of having patients undergo this procedure while awake and unsedated. A FLIP utilizes high-resolution impedance planimetry to measure luminal dimensions during controlled, volumetric distension of a balloon positioned within the esophagus. Esophageal contractility can be elicited by FLIP distension and identified when changes in esophageal diameter are depicted as a function of time. FLIP can therefore detect both esophageal contractions that occlude the esophageal lumen and those that do not (i.e., non-occluding contractions).
Unfortunately, the FLIP technology lacks a validated analysis platform, and diagnosis is made loosely, based on visual pattern recognition and a few numerical measures of distensibility. There remains a need for a tool that can help the clinician distinguish major motor disorders from normal function based on FLIP data.
The present disclosure addresses the aforementioned drawbacks by providing a method for generating classified feature data indicative of an upper gastrointestinal disorder in a subject based on esophageal measurement data acquired from the subject's esophagus. The method includes accessing esophageal measurement data with a computer system, where the esophageal measurement data comprise measurements of pressure within the subject's esophagus and changes in a geometry of the subject's esophagus. A trained machine learning algorithm is also accessed with the computer system, where the trained machine learning algorithm has been trained on training data in order to generate classified feature data from esophageal measurement data. The esophageal measurement data are applied to the trained machine learning algorithm using the computer system, generating output as classified feature data that classify the esophageal measurement data as being indicative of an upper gastrointestinal disorder in the subject.
It is another aspect of the present disclosure to provide a method for generating a report that classifies an upper gastrointestinal disorder in a subject. The method includes accessing functional lumen imaging probe (FLIP) data with a computer system, where the FLIP data depict esophageal pressure and diameter measurements in the subject's esophagus. A trained classification algorithm is also accessed with the computer system. Classified feature data are generated with the computer system by inputting the FLIP data to the trained classification algorithm, generating output as the classified feature data, wherein the classified feature data classify the FLIP data as being indicative of an upper gastrointestinal disorder in the subject. A report is then generated from the classified feature data using the computer system, where the report indicates a classification of the FLIP data being indicative of the upper gastrointestinal disorder in the subject.
The foregoing and other aspects and advantages of the present disclosure will appear from the following description. In the description, reference is made to the accompanying drawings that form a part hereof, and in which there is shown by way of illustration a preferred embodiment. This embodiment does not necessarily represent the full scope of the invention, however, and reference is therefore made to the claims and herein for interpreting the scope of the invention.
Described here are systems and methods for classifying upper gastrointestinal (“UGI”) data, which may include manometry data, panometry data, and/or other data acquired from a subject's UGI tract or a portion thereof (e.g., the subject's esophagus) using, for example, a functional lumen imaging probe (“FLIP”) or other measurement device. The systems and methods described in the present disclosure implement classification algorithms, machine learning algorithms, or combinations thereof, in order to classify these data. For instance, patterns in the input data can be identified and classified using one or more classification and/or machine learning algorithms.
In general, the systems and methods described in the present disclosure provide an artificial intelligence (“AI”) methodology to classify esophageal measurement data into relevant pathologic groups, including esophageal measurement data acquired from functional lumen imaging for esophageal function testing. In some embodiments, the classification may be a binary classification, in which the esophageal measurement data are classified into one of two categories or class labels (e.g., “normal” and “abnormal”). In these instances, classification algorithms including logistic regression, k-nearest neighbors, decision trees, support vector machines, Naive Bayes, and/or artificial neural networks can be implemented.
In some other embodiments, the classification may be a multiclass classification, in which the esophageal measurement data are classified into more than two categories or class labels (e.g., “normal,” “abnormal-not achalasia,” and “abnormal-achalasia”). In these instances, classification algorithms including k-nearest neighbors, decision trees, Naive Bayes, random forest, gradient boosting, and/or artificial neural networks (e.g., convolutional neural networks) can be implemented.
In still other embodiments, the classification may be a multilabel classification, in which the esophageal measurement data are classified into two or more categories or class labels, and where two or more class labels can be predicted for each data sample. For example, a data sample may be classified as “normal” or “abnormal” and an “abnormal” class may be additionally classified as “not achalasia” or “achalasia.” In these instances, classification algorithms including multi-label decision trees, multi-label random forests, multi-label gradient boosting, and/or artificial neural networks (e.g., convolutional neural networks) can be implemented.
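As a non-limiting sketch of how such a classifier might be applied to feature vectors derived from esophageal measurement data, consider the following, using scikit-learn; the feature set, data values, and model choice are illustrative assumptions rather than details from the disclosure:

```python
# Minimal sketch: multiclass classification of hypothetical FLIP-derived
# feature vectors with scikit-learn. Feature names and values are
# illustrative placeholders, not clinical data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Hypothetical features per study: [EGJ distensibility index (mm^2/mmHg),
# max EGJ diameter (mm), contraction rate (per min), balloon pressure (mmHg)]
X = np.random.rand(300, 4) * [8.0, 25.0, 12.0, 60.0]  # placeholder data
y = np.random.randint(0, 3, size=300)  # 0=normal, 1=abnormal-not achalasia, 2=abnormal-achalasia

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

print("held-out accuracy:", clf.score(X_test, y_test))
print("class probabilities for one sample:", clf.predict_proba(X_test[:1]))
```

The same pipeline accommodates the binary case by using two class labels, or the multilabel case by swapping in a multi-label estimator.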
In one example, a neural network, such as a convolutional neural network, that is focused on heat maps estimated, computed, or otherwise determined from esophageal measurement data can be used to classify the esophageal measurement data into one of three distinct patterns: normal, abnormal-not achalasia, and abnormal-achalasia. Classifying patients into one of these three groups can help inform a clinician's decision for treatment and management.
The following acronyms, used throughout the present disclosure, have the associated definitions given in the table below, although other acronyms may be introduced in the detailed description:

Acronym | Definition
---|---
AC | antegrade contraction
ACR | absent contractile response
AI | artificial intelligence
BCR | borderline contractile response
BDCR | borderline/diminished contractile response
CSA | cross-sectional area
CVD | cardiovascular disease
EGJ | esophagogastric junction
EGJ-DI | esophagogastric junction distensibility index
FLIP | functional lumen imaging probe
GERD | gastroesophageal reflux disease
IDCR | impaired/disordered contractile response
LES | lower esophageal sphincter
NCR | normal contractile response
RAC | repetitive antegrade contractions
RC | retrograde contraction
RO6 | rule of sixes
RRC | repetitive retrograde contractions
SCR | spastic contractile response
sLESC | sustained lower esophageal sphincter contraction
SOC | sustained occluding contraction
SRCR | spastic-reactive contractile response
UGI | upper gastrointestinal
Referring now to the accompanying drawings, an example of a system for classifying UGI data is shown. In some embodiments, a computing device 150 can execute at least a portion of a UGI classification system 104 to classify esophageal measurement data received from an esophageal measurement data source 102.
Additionally or alternatively, in some embodiments, the computing device 150 can communicate information about data received from the esophageal measurement data source 102 to a server 152 over a communication network 154, and the server 152 can execute at least a portion of the UGI classification system 104. In such embodiments, the server 152 can return information to the computing device 150 (and/or any other suitable computing device) indicative of an output of the UGI classification system 104.
In some embodiments, computing device 150 and/or server 152 can be any suitable computing device or combination of devices, such as a desktop computer, a laptop computer, a smartphone, a tablet computer, a wearable computer, a server computer, a virtual machine being executed by a physical computing device, and so on.
In some embodiments, esophageal measurement data source 102 can be any suitable source of data (e.g., measurement data, manometry data, panometry data, FLIP data, images or maps reconstructed from such data), such as a functional lumen imaging probe or other suitable imaging or functional measurement device, another computing device (e.g., a server storing data), and so on. In some embodiments, esophageal measurement data source 102 can be local to computing device 150. For example, esophageal measurement data source 102 can be incorporated with computing device 150 (e.g., computing device 150 can be configured as part of a device for capturing, scanning, and/or storing data). As another example, esophageal measurement data source 102 can be connected to computing device 150 by a cable, a direct wireless link, and so on. Additionally or alternatively, in some embodiments, esophageal measurement data source 102 can be located locally and/or remotely from computing device 150, and can communicate data to computing device 150 (and/or server 152) via a communication network (e.g., communication network 154).
In some embodiments, communication network 154 can be any suitable communication network or combination of communication networks. For example, communication network 154 can include a Wi-Fi network (which can include one or more wireless routers, one or more switches, etc.), a peer-to-peer network (e.g., a Bluetooth network), a cellular network (e.g., a 3G network, a 4G network, etc., complying with any suitable standard, such as CDMA, GSM, LTE, LTE Advanced, WiMAX, etc.), a wired network, and so on. In some embodiments, communication network 154 can be a local area network, a wide area network, a public network (e.g., the Internet), a private or semi-private network (e.g., a corporate or university intranet), any other suitable type of network, or any suitable combination of networks. Communications links between the components of the system can each be any suitable communications link or combination of links, such as wired links and/or wireless links (e.g., Bluetooth links, cellular links, Wi-Fi links).
Referring now to the accompanying drawings, an example of hardware that can be used to implement the esophageal measurement data source 102, computing device 150, and server 152 is shown. In some embodiments, computing device 150 can include a processor 202, a display 204, one or more inputs 206, one or more communications systems 208, and/or memory 210.
In some embodiments, communications systems 208 can include any suitable hardware, firmware, and/or software for communicating information over communication network 154 and/or any other suitable communication networks. For example, communications systems 208 can include one or more transceivers, one or more communication chips and/or chip sets, and so on. In a more particular example, communications systems 208 can include hardware, firmware and/or software that can be used to establish a Wi-Fi connection, a Bluetooth connection, a cellular connection, an Ethernet connection, and so on.
In some embodiments, memory 210 can include any suitable storage device or devices that can be used to store instructions, values, data, or the like, that can be used, for example, by processor 202 to present content using display 204, to communicate with server 152 via communications system(s) 208, and so on. Memory 210 can include any suitable volatile memory, non-volatile memory, storage, or any suitable combination thereof. For example, memory 210 can include RAM, ROM, EEPROM, one or more flash drives, one or more hard disks, one or more solid state drives, one or more optical drives, and so on. In some embodiments, memory 210 can have encoded thereon, or otherwise stored therein, a computer program for controlling operation of computing device 150. In such embodiments, processor 202 can execute at least a portion of the computer program to present content (e.g., images, heat maps, user interfaces, graphics, tables), receive content from server 152, transmit information to server 152, and so on.
In some embodiments, server 152 can include a processor 212, a display 214, one or more inputs 216, one or more communications systems 218, and/or memory 220. In some embodiments, processor 212 can be any suitable hardware processor or combination of processors, such as a CPU, a GPU, and so on. In some embodiments, display 214 can include any suitable display devices, such as a computer monitor, a touchscreen, a television, and so on. In some embodiments, inputs 216 can include any suitable input devices and/or sensors that can be used to receive user input, such as a keyboard, a mouse, a touchscreen, a microphone, and so on.
In some embodiments, communications systems 218 can include any suitable hardware, firmware, and/or software for communicating information over communication network 154 and/or any other suitable communication networks. For example, communications systems 218 can include one or more transceivers, one or more communication chips and/or chip sets, and so on. In a more particular example, communications systems 218 can include hardware, firmware and/or software that can be used to establish a Wi-Fi connection, a Bluetooth connection, a cellular connection, an Ethernet connection, and so on.
In some embodiments, memory 220 can include any suitable storage device or devices that can be used to store instructions, values, data, or the like, that can be used, for example, by processor 212 to present content using display 214, to communicate with one or more computing devices 150, and so on. Memory 220 can include any suitable volatile memory, non-volatile memory, storage, or any suitable combination thereof. For example, memory 220 can include RAM, ROM, EEPROM, one or more flash drives, one or more hard disks, one or more solid state drives, one or more optical drives, and so on. In some embodiments, memory 220 can have encoded thereon a server program for controlling operation of server 152. In such embodiments, processor 212 can execute at least a portion of the server program to transmit information and/or content (e.g., data, images, a user interface) to one or more computing devices 150, receive information and/or content from one or more computing devices 150, receive instructions from one or more devices (e.g., a personal computer, a laptop computer, a tablet computer, a smartphone), and so on.
In some embodiments, esophageal measurement data source 102 can include a processor 222, one or more inputs 224, one or more communications systems 226, and/or memory 228. In some embodiments, processor 222 can be any suitable hardware processor or combination of processors, such as a CPU, a GPU, and so on. In some embodiments, the one or more inputs 224 are generally configured to acquire data and can include a functional lumen imaging probe. Additionally or alternatively, in some embodiments, one or more inputs 224 can include any suitable hardware, firmware, and/or software for coupling to and/or controlling operations of a functional lumen imaging probe. In some embodiments, one or more portions of the one or more inputs 224 can be removable and/or replaceable.
Note that, although not shown, esophageal measurement data source 102 can include any suitable inputs and/or outputs. For example, esophageal measurement data source 102 can include input devices and/or sensors that can be used to receive user input, such as a keyboard, a mouse, a touchscreen, a microphone, a trackpad, a trackball, and so on. As another example, esophageal measurement data source 102 can include any suitable display devices, such as a computer monitor, a touchscreen, a television, etc., one or more speakers, and so on.
In some embodiments, communications systems 226 can include any suitable hardware, firmware, and/or software for communicating information to computing device 150 (and, in some embodiments, over communication network 154 and/or any other suitable communication networks). For example, communications systems 226 can include one or more transceivers, one or more communication chips and/or chip sets, and so on. In a more particular example, communications systems 226 can include hardware, firmware and/or software that can be used to establish a wired connection using any suitable port and/or communication standard (e.g., VGA, DVI video, USB, RS-232, etc.), Wi-Fi connection, a Bluetooth connection, a cellular connection, an Ethernet connection, and so on.
In some embodiments, memory 228 can include any suitable storage device or devices that can be used to store instructions, values, data, or the like, that can be used, for example, by processor 222 to control the one or more inputs 224; to receive data from the one or more inputs 224; to generate images, heat maps, and/or computed parameters from data; to present content (e.g., images, heat maps, a user interface) using a display; to communicate with one or more computing devices 150; and so on. Memory 228 can include any suitable volatile memory, non-volatile memory, storage, or any suitable combination thereof. For example, memory 228 can include RAM, ROM, EEPROM, one or more flash drives, one or more hard disks, one or more solid state drives, one or more optical drives, and so on. In some embodiments, memory 228 can have encoded thereon, or otherwise stored therein, a program for controlling operation of esophageal measurement data source 102. In such embodiments, processor 222 can execute at least a portion of the program to compute parameters, transmit information and/or content (e.g., data, images, heat maps) to one or more computing devices 150, receive information and/or content from one or more computing devices 150, receive instructions from one or more devices (e.g., a personal computer, a laptop computer, a tablet computer, a smartphone, etc.), and so on.
In some embodiments, any suitable computer readable media can be used for storing instructions for performing the functions and/or processes described herein. For example, in some embodiments, computer readable media can be transitory or non-transitory. For example, non-transitory computer readable media can include media such as magnetic media (e.g., hard disks, floppy disks), optical media (e.g., compact discs, digital video discs, Blu-ray discs), semiconductor media (e.g., random access memory (“RAM”), flash memory, electrically programmable read only memory (“EPROM”), electrically erasable programmable read only memory (“EEPROM”)), any suitable media that is not fleeting or devoid of any semblance of permanence during transmission, and/or any suitable tangible media. As another example, transitory computer readable media can include signals on networks, in wires, conductors, optical fibers, circuits, or any suitable media that is fleeting and devoid of any semblance of permanence during transmission, and/or any suitable intangible media.
Referring now to the accompanying drawings, a flowchart is shown setting forth the steps of an example method for classifying esophageal measurement data using an AI-based classifier.
The method includes accessing esophageal measurement data or other UGI measurement data with a computer system, as indicated at step 302. For instance, the computing device 150 (or the server 152) can access the esophageal measurement data from the esophageal measurement data source 102 through either a wired connection or a wireless connection, as described above. In some embodiments, the esophageal measurement data can include measurement data indicating measurements of one or more characteristics of the UGI tract, such as pressure and/or geometry (e.g., lumen diameter or other geometric measurements). For example, the esophageal measurement data can include measurements of pressure and/or geometry of the subject's UGI tract or a portion thereof (e.g., the esophagus).
As one non-limiting example, the esophageal measurement data may indicate measurements of pressure and/or geometry of the subject's esophagus. The esophageal pressure and geometry data can be FLIP data acquired from the subject's esophagus using a FLIP system, and may include, as a non-limiting example, measurement data such as pump status (e.g., inflated, deflated, or stopped), readings from the sensor pairs on the catheter balloon that indicate the diameter of the lumen, balloon pressure, balloon volume, and/or balloon temperature. Additionally or alternatively, the esophageal pressure and geometry data may include other manometry, planimetry, and/or panometry data. The esophageal pressure and geometry data may include measurement values or plots of measurement values. In some instances, the esophageal pressure and geometry data may include two-dimensional images, or heat maps, that depict a spatial and/or spatiotemporal distribution of esophageal pressure and/or geometric measurement values.
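One plausible in-memory layout for such measurement data is sketched below; the sensor-pair count and field names are assumptions for illustration, and an actual FLIP system's export format may differ:

```python
# Sketch of one plausible layout for FLIP-style measurement data.
# The 16-sensor-pair count and field names are assumptions for
# illustration; actual device exports may differ.
from dataclasses import dataclass
import numpy as np

@dataclass
class FlipSample:
    time_s: float             # acquisition time
    pump_status: str          # e.g., "inflated", "deflated", "stopped"
    diameters_mm: np.ndarray  # shape (16,): lumen diameter at each sensor pair
    pressure_mmhg: float      # balloon pressure
    volume_ml: float          # balloon fill volume
    temperature_c: float      # balloon temperature

def to_heatmap(samples):
    """Stack per-sensor diameters into a (sensors x time) array suitable
    for rendering as a spatiotemporal heat map."""
    return np.stack([s.diameters_mm for s in samples], axis=1)
```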
Accessing the esophageal measurement data may include retrieving such data from a memory or other suitable data storage device or medium. Alternatively, accessing the input data may include acquiring such data with a suitable measurement device, such as a functional lumen imaging probe, and transferring or otherwise communicating the data to the computer system.
In some instances, the esophageal measurement data can include measurements of esophageal pressure and/or geometry that may include artifacts, such as artifacts related to the diameter measured during periods of strong esophageal contraction. During contractions where the lumen is occluded, the measurements may be negated as the contraction can interrupt the flow of current within the catheter. These artifacts can therefore be detected in the data, and the data processed accordingly to remove the artifacts.
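As a sketch of one way such dropout artifacts might be handled (the dropout threshold here is an assumed placeholder, not a value from the disclosure), implausibly low or missing diameter readings can be flagged and filled by interpolating along time:

```python
# Sketch: detect and repair diameter dropouts caused by lumen-occluding
# contractions interrupting the impedance measurement. The dropout
# threshold is an assumed placeholder.
import numpy as np

def remove_occlusion_artifacts(diam, dropout_mm=4.0):
    """diam: (sensors, time) diameter array. Returns a copy with
    flagged samples replaced by linear interpolation along time."""
    repaired = diam.astype(float).copy()
    for row in repaired:
        bad = ~np.isfinite(row) | (row < dropout_mm)
        if bad.any() and (~bad).any():
            t = np.arange(row.size)
            row[bad] = np.interp(t[bad], t[~bad], row[~bad])
    return repaired
```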
The esophageal measurement data are then input to an AI-based classifier, generating output as classified feature data, as indicated at step 304. For instance, the processor 202 of the computing device 150 (or the processor 212 of the server 152) receives the esophageal measurement data and provides the esophageal measurement data as input data to an AI-based classifier executed by the processor 202 (or processor 212), generating output data as the classified feature data. The AI-based classifier can be implemented by the processor 202 executing an AI classifier program, algorithm, or model stored in the memory 210 of the computer device 150, or alternatively by the processor 212 executing an AI classifier program, algorithm, or model stored in the memory 220 of the server 152. For example, the AI classifier program, algorithm, or model executing on the processor 202 (or processor 212) processes (e.g., classifies according to one of the machine learning and/or artificial intelligence algorithms described in the present disclosure) the received esophageal measurement data and generates an output as the classified feature data.
The classified feature data may include a classification of the subject as belonging to a particular classification of upper gastrointestinal disorder, a quantifiable probability score of the subject belonging to one or more upper gastrointestinal disorders, and the like. As one example, the classified feature data may indicate the probability for a particular classification (i.e., the probability that a subject belongs to a particular class), such as normal, abnormal-not achalasia, and abnormal-achalasia.
In some embodiments, the computing device 150 and/or server 152 may store a selection of various AI-based classifiers, in which each AI-based classifier is specifically configured to perform a different classification task. In such embodiments, the user may select which of the AI-based classifiers to implement with the computing device 150 and/or server 152. For example, the computing device 150 or another external device (e.g., a smartphone, a tablet computer, a cellular phone, a laptop computer, a smart watch, and the like) may provide a graphical interface that allows the user to select a type of AI-based classifier. A user may select the AI-based classifier based on, for example, the type of esophageal measurement data available for the subject.
As described above, the AI-based classifier may implement any number of suitable AI classification programs, algorithms, and/or models, including logistic regression, k-nearest neighbors, decision trees, support vector machines, Naive Bayes, random forest, gradient boosting, and/or artificial neural networks (e.g., convolutional neural networks).
In some embodiments, more than one AI-based classifier can be implemented to process the esophageal measurement data. For example, esophageal measurement data can be input to a first AI-based classifier to generate output as first classified feature data. The esophageal measurement data, first classified feature data, or both, can then be input to a second AI-based classifier to generate output as second classified feature data. The first classified feature data may indicate the presence of one or more contractile patterns in the esophageal measurement data, as an example. The presence and/or identification of these contractile patterns can be used as an input to a second AI-based classifier, in addition to other esophageal measurement data or other data (e.g., parameters that are computed or estimated from esophageal measurement data). The second classified feature data can then indicate a classification of the esophageal measurement data as indicating a particular condition, such as a normal condition, an abnormal but inconclusive for achalasia condition, or an achalasia condition.
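A minimal sketch of such a two-stage cascade follows; the models, feature dimensions, and data are illustrative placeholders, not the disclosed implementation:

```python
# Sketch of a two-stage classifier cascade: a first model flags
# contractile patterns, whose outputs feed a second model together with
# computed parameters (e.g., EGJ-DI). All data here are placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
heatmap_features = rng.random((200, 32))   # stage-1 inputs (placeholder)
egj_di = rng.random((200, 1)) * 8.0        # computed parameter (placeholder)
pattern_labels = rng.integers(0, 2, 200)   # e.g., contractile pattern present/absent
final_labels = rng.integers(0, 3, 200)     # normal / abnormal-not achalasia / achalasia

stage1 = LogisticRegression(max_iter=1000).fit(heatmap_features, pattern_labels)
pattern_prob = stage1.predict_proba(heatmap_features)[:, 1:]  # first classified feature data

stage2_inputs = np.hstack([pattern_prob, egj_di])             # patterns + parameters
stage2 = GradientBoostingClassifier().fit(stage2_inputs, final_labels)
print(stage2.predict(stage2_inputs[:5]))                      # second classified feature data
```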
The classified feature data generated by processing the esophageal measurement data using the processor 202 and/or processor 212 executing an AI-based classifier can then be displayed to a user, stored for later use or further processing, or both, as indicated at step 306. For example, the classified feature data may be stored locally by the computer device 150 (e.g., in the memory 210) or displayed to the user via the display 204 of the computing device 150. Additionally or alternatively, the classified feature data may be stored in the memory 220 of the server 152 and/or displayed to a user via the display 214 of the server 152. In still other embodiments, the classified feature data may be stored in a memory or other data storage device or medium other than those associated with the computing device 150 or server 152. In these instances, the classified feature data can be transmitted to such other devices using the communication network 154 or other wired or wireless communication links.
In one example, the computer system (e.g., computing device 150, server 152) implements an artificial neural network for the AI-based classifier. The artificial neural network generally includes an input layer, one or more hidden layers or nodes, and an output layer. Typically, the input layer includes as many nodes as inputs provided to the computer system. As described above, the number (and the type) of inputs provided to the computer system may vary based on the particular task for the AI-based classifier. Accordingly, the input layer of the artificial neural network may have a different number of nodes based on the particular task for the AI-based classifier.
In some embodiments, the input to the AI-based classifier may include esophageal measurement data such as pump status (e.g., inflated, deflated, or stopped), readings from the sensor pairs on the catheter balloon that indicate the diameter of the lumen, balloon pressure, balloon volume, and/or balloon temperature, which may be measured with a FLIP system or other suitable measurement system or device.
The input layer connects to the one or more hidden layers. The number of hidden layers varies and may depend on the particular task for the AI-based classifier. Additionally, each hidden layer may have a different number of nodes and may be connected to the next layer differently. For example, each node of the input layer may be connected to each node of the first hidden layer. The connection between each node of the input layer and each node of the first hidden layer may be assigned a weight parameter. Additionally, each node of the neural network may also be assigned a bias value. However, each node of the first hidden layer may not be connected to each node of the second hidden layer; that is, there may be some nodes of the first hidden layer that are not connected to all of the nodes of the second hidden layer. The connections between the nodes of the first and second hidden layers are each assigned different weight parameters. Each node of a hidden layer is associated with an activation function, which defines how the hidden layer processes the input received from the input layer or from a previous hidden layer. These activation functions may vary based not only on the type of task associated with the AI-based classifier, but also on the specific type of hidden layer implemented.
Each hidden layer may perform a different function. For example, some hidden layers can be convolutional hidden layers, which can, in some instances, reduce the dimensionality of the inputs, while other hidden layers can perform more statistical functions such as max pooling (which reduces a group of inputs to their maximum value), averaging, and so on. In some of the hidden layers, each node may be connected to each node of the next hidden layer. Neural networks including more than, for example, three hidden layers may be considered deep neural networks.
The last hidden layer in the artificial neural network is connected to the output layer. Similar to the input layer, the output layer typically has the same number of nodes as the possible outputs. In an example in which the AI-based classifier is a multiclass classifier, the output layer may include, for example, a number of different nodes, where each different node corresponds to a different class or label of the esophageal measurement data. A first node may indicate that the esophageal measurement data are classified as a normal class type, a second node may indicate that the esophageal measurement data are classified as an abnormal-not achalasia class type, and a third node may indicate that the esophageal measurement data are classified as an abnormal-achalasia class type. Additionally or alternatively, an additional node may indicate that the esophageal measurement data corresponds to an unknown (or unidentifiable) class. In some embodiments, the computer system then selects the output node with the highest value and indicates to the computer system or to the user the corresponding classification of the esophageal measurement data (e.g., by outputting and/or displaying the classified feature data). In some embodiments, the computer system may also select more than one output node.
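As a minimal sketch of such a network in PyTorch, applied to heat-map inputs: the input dimensions (1 channel, 16 sensors, 240 time samples), layer sizes, and class count are assumptions for illustration, not parameters from the disclosure:

```python
# Sketch of a small convolutional network over FLIP-style heat maps.
# Input size and layer sizes are assumptions for illustration.
import torch
import torch.nn as nn

class FlipCnn(nn.Module):
    def __init__(self, n_classes=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # max-pooling layer
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 8)),         # averaging layer
        )
        self.classifier = nn.Linear(16 * 4 * 8, n_classes)  # one output node per class

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))      # raw class scores (logits)

net = FlipCnn()
heatmaps = torch.randn(2, 1, 16, 240)             # placeholder batch
probs = torch.softmax(net(heatmaps), dim=1)       # per-class probabilities
print(probs.argmax(dim=1))  # select the output node with the highest value
```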
Referring now to the accompanying drawings, a flowchart is shown setting forth the steps of an example method for classifying esophageal measurement data using a trained neural network or other suitable machine learning algorithm.
The method includes accessing esophageal measurement data, which may include esophageal pressure and geometry (e.g., diameter or other geometric measurements) data, with a computer system, as indicated at step 402. As one non-limiting example, the esophageal pressure and geometry data can be FLIP data acquired from a subject's esophagus using a FLIP system. Additionally or alternatively, the esophageal pressure and geometry data may include other manometry, planimetry, and/or panometry data. The esophageal pressure and geometry data may include measurement values or plots of measurement values. In some instances, the esophageal pressure and geometry data may include two-dimensional images, or heat maps, that depict a spatial and/or spatiotemporal distribution of esophageal pressure and/or geometric measurement values. Additionally or alternatively, the esophageal measurement data may include data such as pump status (e.g., inflated, deflated, or stopped), readings from the sensor pairs on the catheter balloon that indicate the diameter of the lumen, balloon pressure, balloon volume, and/or balloon temperature.
Accessing the esophageal measurement data may include retrieving such data from a memory or other suitable data storage device or medium. Alternatively, accessing the esophageal measurement data may include acquiring such data with a suitable measurement device, such as a functional lumen imaging probe, and transferring or otherwise communicating the data to the computer system.
In some instances, the measurements of esophageal pressure and/or geometry may include artifacts, such as artifacts related to the diameter measured during periods of strong contraction. During contractions where the lumen is occluded, the measurements may be negated as the contraction can interrupt the flow of current within the catheter. These artifacts can therefore be detected in the data, and the data processed accordingly to remove the artifacts.
A trained neural network (or other suitable machine learning algorithm) is then accessed with the computer system, as indicated at step 404. Accessing the trained neural network may include accessing network parameters (e.g., weights, biases, or both) that have been optimized or otherwise estimated by training the neural network on training data. In some instances, retrieving the neural network can also include retrieving, constructing, or otherwise accessing the particular neural network architecture to be implemented. For instance, data pertaining to the layers in the neural network architecture (e.g., number of layers, type of layers, ordering of layers, connections between layers, hyperparameters for layers) may be retrieved, selected, constructed, or otherwise accessed. As a non-limiting example, the trained neural network may be a trained convolutional neural network.
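For instance, a minimal sketch of storing and restoring such network parameters in PyTorch follows; the architecture and checkpoint filename are hypothetical placeholders:

```python
# Sketch: saving and restoring trained network parameters in PyTorch.
# The architecture and checkpoint filename are hypothetical placeholders.
import torch
import torch.nn as nn

def build_network():
    # Re-create the same architecture used during training.
    return nn.Sequential(nn.Flatten(), nn.Linear(16 * 240, 3))

net = build_network()
torch.save(net.state_dict(), "flip_classifier.pt")          # after training

restored = build_network()                                  # access the architecture
restored.load_state_dict(torch.load("flip_classifier.pt"))  # access weights/biases
restored.eval()                                             # inference mode
```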
In general, the neural network is trained, or has been trained, on training data in order to identify patterns (e.g., contractile response patterns) in the esophageal pressure and geometry data, classify the esophageal pressure and geometry data based on the identified patterns, and to generate output as classified data and/or feature data representative of different upper gastrointestinal disorder classifications and/or probability scores of different upper gastrointestinal disorder classifications.
The esophageal pressure and geometry data are then input to the trained neural network, generating output as classified feature data, as indicated at step 406. For example, the classified feature data may include a classification of the subject as belonging to a particular classification of upper gastrointestinal disorder, a quantifiable probability score of the subject belonging to one or more upper gastrointestinal disorders, and the like. As one example, the classified feature data may indicate the probability for a particular classification (i.e., the probability that a subject belongs to a particular class), such as normal, abnormal-not achalasia, and abnormal-achalasia.
In some embodiments, the classified feature data may indicate that a particular distention-induced contractility pattern is present in the esophageal measurement data. Examples of different distention-induced contractility patterns are described below with respect to the labeling of training data.
The classified feature data generated by inputting the esophageal measurement data to the trained neural network(s) can then be displayed to a user, stored for later use or further processing, or both, as indicated at step 408.
Referring now to the accompanying drawings, a flowchart is shown setting forth the steps of an example method for training one or more neural networks (or other suitable machine learning algorithms) to generate classified feature data from esophageal measurement data.
In general, the neural network(s) can implement any number of different neural network architectures. For instance, the neural network(s) could implement a convolutional neural network, a residual neural network, or the like. In some instances, the neural network(s) may implement deep learning.
Alternatively, the neural network(s) could be replaced with other suitable machine learning algorithms, such as those based on supervised learning, unsupervised learning, deep learning, ensemble learning, dimensionality reduction, and so on.
The method includes accessing training data with a computer system, as indicated at step 502. Accessing the training data may include retrieving such data from a memory or other suitable data storage device or medium. Alternatively, accessing the training data may include acquiring such data with a FLIP system, or other suitable measurement system, and transferring or otherwise communicating the data to the computer system, which may be a part of the FLIP or other suitable measurement system. In general, the training data can include esophageal measurement data, such as esophageal pressure and diameter measurement data.
Additionally or alternatively, the method can include assembling training data from esophageal measurement data using a computer system. This step may include assembling the esophageal measurement data into an appropriate data structure on which the machine learning algorithm can be trained. Assembling the training data may include assembling esophageal measurement data, segmented esophageal measurement data, labeled esophageal measurement data, and other relevant data. For instance, assembling the training data may include generating labeled data and including the labeled data in the training data. Labeled data may include esophageal measurement data, segmented esophageal measurement data, or other relevant data that have been labeled as belonging to, or otherwise being associated with, one or more different classifications or categories.
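One simple sketch of such a data structure follows; the label names anticipate the contractile response patterns described below, and the data values are placeholders:

```python
# Sketch: assembling labeled training data as (heat map, class label)
# pairs. Label names follow the contractile response patterns described
# below; data values are placeholders.
import numpy as np

LABELS = {"NCR": 0, "BCR": 1, "BDCR": 2, "IDCR": 3, "ACR": 4, "SCR": 5}

def assemble_training_data(studies):
    """studies: iterable of (heatmap ndarray, pattern-name str) pairs."""
    X = np.stack([h for h, _ in studies])
    y = np.array([LABELS[name] for _, name in studies])
    return X, y

X, y = assemble_training_data([(np.random.rand(16, 240), "NCR"),
                               (np.random.rand(16, 240), "SCR")])
```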
As one non-limiting example, labeled data may include esophageal measurement data and/or segmented esophageal measurement data that have been labeled based on different distention-induced contractility patterns. For instance, the labeled data may include esophageal measurement data labeled as including a repetitive antegrade contractions (“RAC”) pattern, such as the RAC pattern illustrated in the accompanying drawings.
In some instances, the labeled data may include esophageal measurement data labeled as including a repeating contractile response pattern. As an example, the repeating contractile pattern may include a repeating RAC pattern, such as the repeating RAC patterns shown in the accompanying drawings.
Other example contractile response patterns may include normal contractile response (“NCR”), borderline/diminished contractile response (“BDCR”), borderline contractile response (“BCR”), impaired/disordered contractile response (“IDCR”), spastic contractile response (“SCR”), and/or spastic-reactive contractile response (“SRCR”). Example pathophysiology characterizations and definitions of these contractile response patterns are described below, and examples are illustrated in the accompanying drawings.
NCR can be representative of a pathophysiology indicating normal neurogenic control and muscular function. As an example, NCR can be defined based on a rule of sixes (“RO6”), in which six normal contractions are observed or otherwise recorded over a period of time, such as per minute. For instance, the RO6 criterion can be satisfied when ≥6 consecutive ACs that are ≥6 cm in axial length occur at a regular rate of 6±3 ACs per minute.
BCR can be defined as a contractile pattern that does not satisfy the RO6 criterion; in which a distinct AC of at least 6 cm axial length is present; that may have RCs, but not RRCs; and that has no SOCs or sLESCs.
BDCR can be representative of a pathophysiology indicating early transition/borderline loss of neurogenic control, which can be evidenced by fewer ACs, delayed triggering at higher volumes, and possibly a higher rate of ACs. Additionally or alternatively, BDCR can be representative of a pathophysiology indicating early transition/borderline muscular dysfunction, which can be evidenced by fewer, weaker ACs; slower, more pronounced contractions may also be seen, which may reflect hypertrophy as an early phase of the response to obstruction. As an example, BDCR can be defined as contractile patterns not meeting the RO6 criterion and in which antegrade contractions (“ACs”) are present; retrograde contractions (“RCs”) may be present, but not repetitive retrograde contractions (“RRCs”); and no sustained occluding contractions (“SOCs”) are present.
IDCR can be representative of a pathophysiology indicating late progression/severe loss of neurogenic control and/or muscular function, which can be evidenced by sporadic or chaotic contractions with no propagation or by progressing achalasia, and/or by a response to distension that is not distinct or not associated with a volume trigger. As an example, IDCR can be defined as contractile patterns in which no distinct ACs are present; that may have sporadic or chaotic contractions not meeting the criteria for ACs; that may have RCs, but not RRCs; and in which no SOCs are present.
ACR can be representative of a pathophysiology indicating complete loss of the neurogenic trigger for secondary peristalsis, which can be related to neuropathy, CVD, diabetes, age, and/or chronic GERD, and may be evidenced by impaired triggering due to dilatation of the wall or loss of compliance. Additionally or alternatively, ACR can be representative of a pathophysiology indicating end-stage muscular dysfunction, such as esophageal dilatation, distortion of the anatomy, and/or atrophy. As an example, ACR can be defined as contractile patterns in which no contractile activity is present (e.g., no contractile activity in the esophageal cavity). In these instances, LES-L may be present with no evidence of contraction in the esophageal body. As an example, the esophageal measurement data may indicate bag pressures greater than 40 mmHg.
SCR can be representative of a pathophysiology indicating neurogenic disruption leading to reduced latency and sustained contraction, which may be representative of an intrinsic neurogenic dysfunction and/or a response to obstruction. As an example, SCR can be defined as contractile patterns in which SOCs are present, which may have sporadic ACs, and in which RRCs are present (e.g., at least 6 RCs at a rate >9 RCs per minute). Similarly, SRCR can be defined as contractile patterns in which SOCs, sLESCs, or RRCs (at least 6 RCs at a rate >9 RCs per minute) are present, and that may have sporadic ACs.
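The rule-based definitions above can be sketched in code. The following is a simplified illustration only: it collapses SCR into the broader SRCR check, assumes upstream detection of ACs, RCs, RRCs, SOCs, and sLESCs, and uses hypothetical feature key names:

```python
# Sketch: rule-based assignment of contractile response patterns from
# detected contraction features, paraphrasing the definitions above.
# Feature extraction (detecting ACs, RCs, SOCs, etc.) is assumed to
# happen upstream; dictionary keys are illustrative placeholders.
def ro6_satisfied(consecutive_acs, ac_lengths_cm, rate_per_min):
    """Rule of sixes: >=6 consecutive ACs, each >=6 cm, at 6 +/- 3 per minute."""
    return (consecutive_acs >= 6
            and all(length >= 6.0 for length in ac_lengths_cm)
            and 3.0 <= rate_per_min <= 9.0)

def classify_pattern(f):
    """f: dict of detected features. Returns a contractile response label."""
    if not f["any_contractile_activity"]:
        return "ACR"                       # no contractile activity at all
    if f["socs"] or f["sustained_les_contractions"] or f["rrcs"]:
        return "SRCR"                      # spastic-reactive features present
    if ro6_satisfied(f["consecutive_acs"], f["ac_lengths_cm"], f["ac_rate_per_min"]):
        return "NCR"
    if f["distinct_ac_at_least_6cm"]:
        return "BCR"                       # distinct AC present, RO6 not met
    if f["any_acs"]:
        return "BDCR"                      # ACs present but diminished
    return "IDCR"                          # sporadic/chaotic, no distinct ACs
```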
As still another example, the labeled data may include esophageal measurement data that are labeled as containing SOCs, as shown in the accompanying drawings.
Additional examples of contractile response patterns that can be used when generating labeled data, or that can be identified as classified feature data, are shown in the accompanying drawings.
As described above, in some instances, the measurements of esophageal pressure and/or geometry may include artifacts, such as artifacts related to the diameter measured during periods of strong contraction. These artifacts can be detected and removed from the esophageal measurement data, as described above.
Referring again to the training method, one or more neural networks (or other suitable machine learning algorithms) are trained on the training data, as indicated at step 504. In general, the neural network(s) can be trained by optimizing network parameters (e.g., weights, biases, or both) based on minimizing a loss function.
Training a neural network may include initializing the neural network, such as by computing, estimating, or otherwise selecting initial network parameters (e.g., weights, biases, or both). Training data can then be input to the initialized neural network, generating output as classified feature data. The quality of the classified feature data can then be evaluated, such as by passing the classified feature data to the loss function to compute an error. The current neural network can then be updated based on the calculated error (e.g., using backpropagation methods based on the calculated error). For instance, the current neural network can be updated by updating the network parameters (e.g., weights, biases, or both) in order to minimize the loss according to the loss function. When the error has been minimized (e.g., by determining whether an error threshold or other stopping criterion has been satisfied), the current neural network and its associated network parameters represent the trained neural network. Different types of training algorithms can be used to adjust the bias values and the weights of the node connections based on the training examples. The training algorithms may include, for example, gradient descent, Newton's method, conjugate gradient, quasi-Newton, Levenberg-Marquardt, among others.
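A minimal sketch of this training loop in PyTorch follows; the model, batch contents, epoch count, and learning rate are placeholders, not values from the disclosure:

```python
# Sketch of the training procedure described above. The model, data,
# epoch count, and learning rate are placeholders.
import torch
import torch.nn as nn

net = nn.Sequential(nn.Flatten(), nn.Linear(16 * 240, 3))  # initialized parameters
optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)    # gradient-based optimizer
loss_fn = nn.CrossEntropyLoss()                            # loss function

heatmaps = torch.randn(32, 1, 16, 240)    # placeholder training inputs
labels = torch.randint(0, 3, (32,))       # placeholder class labels

for epoch in range(20):                   # simple stopping criterion: fixed epochs
    logits = net(heatmaps)                # forward pass -> classified feature data
    loss = loss_fn(logits, labels)        # evaluate output quality via the loss
    optimizer.zero_grad()
    loss.backward()                       # backpropagate the error
    optimizer.step()                      # update weights and biases
```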
The one or more trained neural networks are then stored for later use, as indicated at step 506. Storing the neural network(s) may include storing network parameters (e.g., weights, biases, or both), which have been computed or otherwise estimated by training the neural network(s) on the training data. Storing the trained neural network(s) may also include storing the particular neural network architecture to be implemented. For instance, data pertaining to the layers in the neural network architecture (e.g., number of layers, type of layers, ordering of layers, connections between layers, hyperparameters for layers) may be stored.
In addition to training neural networks, other machine learning or classification algorithms can also be trained and implemented for generating classified feature data. As one example, esophageal measurement data can be classified by computing parameters from the esophageal measurement data and classifying the esophageal measurement data based in part on those computed parameters. For instance, esophagogastric junction (“EGJ”) distensibility index (“EGJ-DI”) can be computed and used to classify esophageal measurement data. The EGJ-DI can be computed as,
$$\text{EGJ-DI} = \frac{\text{Narrowest CSA}_{\text{EGJ}}}{\text{Pressure}}$$

where Narrowest CSA_EGJ is the narrowest cross-sectional area of the EGJ measured in the esophageal measurement data, and Pressure is the corresponding intraballoon pressure, such that the EGJ-DI is expressed in mm²/mmHg. An example table of EGJ-DI values is shown in the accompanying drawings.
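A short sketch of this computation follows; the array layout and the median aggregation over time are assumptions for illustration, not details specified by the disclosure:

```python
# Sketch: computing EGJ-DI from measurement arrays. Variable names,
# indexing, and median aggregation are assumptions for illustration.
import numpy as np

def egj_di(csa_mm2, pressure_mmhg):
    """EGJ-DI = narrowest EGJ cross-sectional area / intraballoon pressure.
    csa_mm2: (egj_sensors, time) CSA samples over the EGJ region.
    pressure_mmhg: (time,) intraballoon pressure samples."""
    narrowest = csa_mm2.min(axis=0)               # narrowest CSA at each time point
    return np.median(narrowest / pressure_mmhg)   # mm^2 per mmHg
```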
An example classification scheme based on EGJ-DI is shown in the accompanying drawings.
An example workflow for implementing a classification scheme according to some embodiments described in the present disclosure is shown in the accompanying drawings.
Additional example classification schemes that utilize both EGJ-DI (and/or other measured parameters) and contractile response patterns are shown in the accompanying drawings.
The present disclosure has described one or more preferred embodiments, and it should be appreciated that many equivalents, alternatives, variations, and modifications, aside from those expressly stated, are possible and within the scope of the invention.
This application claims the benefit of U.S. Provisional Patent Application Ser. No. 63/079,060 filed on Sep. 16, 2020, and entitled “CLASSIFICATION OF FUNCTIONAL LUMEN IMAGING PROBE DATA,” and of U.S. Provisional Patent Application Ser. No. 63/201,599 filed on May 5, 2021, and entitled “CLASSIFICATION OF FUNCTIONAL LUMEN IMAGING PROBE DATA,” both of which are herein incorporated by reference in their entirety.
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/US2021/071470 | 9/15/2021 | WO |
Number | Date | Country | |
---|---|---|---|
63201599 | May 2021 | US | |
63079060 | Sep 2020 | US |