The present disclosure relates to a technical field of an image processing device, an image processing method, and a storage medium for processing an image to be acquired in endoscopic examination.
An endoscopic examination system for displaying images taken in the lumen of an organ is known. For example, Patent Literature 1 discloses a learning method for a learning model that outputs information relating to a lesion part included in endoscope image data when the endoscope image data generated by a photographing device is inputted thereto. Further, Patent Literature 2 discloses a classification method for classifying series data by applying the sequential probability ratio test (SPRT). Non-Patent Literature 1 also discloses an approximate computation method of the matrix used when performing multi-class classification in the SPRT-based method disclosed in Patent Literature 2.
In the case of detecting a lesion from images taken in an endoscopic examination, there are a method of detecting the lesion based on a fixed predetermined number of images and a method of detecting the lesion based on a variable number of images as described in Patent Literature 2. The lesion detection method based on a fixed number of images achieves highly accurate lesion detection even when there is no change between the images, but has an issue in that it is susceptible to noise such as blurring. In contrast, the lesion detection method based on a variable number of images described in Patent Literature 2 is less susceptible to instantaneous noise and can promptly detect an easily distinguishable lesion; however, it has an issue in that it could lead to delayed detection or a missed lesion when there is no substantial change between the images.
In view of the above-described issue, it is therefore an example object of the present disclosure to provide an image processing device, an image processing method, and a storage medium capable of suitably detecting a lesion in an endoscopic image.
One mode of the image processing device is an image processing device including:
One mode of the image processing method is an image processing method executed by a computer, the image processing method including:
One mode of the storage medium is a storage medium storing a program executed by a computer, the program causing the computer to:
An example advantage according to the present disclosure is to suitably detect a lesion in an endoscope image.
Hereinafter, example embodiments of an image processing device, an image processing method, and a storage medium will be described with reference to the drawings.
The image processing device 1 acquires an image (also referred to as “endoscopic image Ia”) captured by the endoscope 3 in time series from the endoscope 3 and displays a screen image based on the endoscopic image Ia on the display device 2. The endoscopic image Ia is an image captured at predetermined time intervals in at least one of the insertion process of the endoscope 3 to the subject or the ejection process of the endoscope 3 from the subject. In the present example embodiment, the image processing device 1 analyzes the endoscopic image Ia to detect the endoscopic image Ia in which the lesion part is included, and displays the information regarding the detection result on the display device 2.
The display device 2 is a display or the like for displaying information based on the display signal supplied from the image processing device 1.
The endoscope 3 mainly includes an operation unit 36 for the examiner to perform a predetermined input, a shaft 37 which has flexibility and which is inserted into the organ to be photographed of the subject, a pointed end unit 38 having a built-in photographing unit such as an ultra-small image pickup device, and a connecting unit 39 for connecting with the image processing device 1.
The configuration of the endoscopic examination system 100 shown in
Hereafter, as a representative example, the description will be given of the process in the endoscopic examination of the large bowel. However, the examination target is not limited to the large bowel and it may be an esophagus or stomach. Examples of the target of the endoscopic examination in the present disclosure include a laryngendoscope, a bronchoscope, an upper digestive tube endoscope, a duodenum endoscope, a small bowel endoscope, a large bowel endoscope, a capsule endoscope, a thoracoscope, a laparoscope, a cystoscope, a cholangioscope, an arthroscope, a spinal endoscope, a blood vessel endoscope, and an epidural endoscope. In addition, the conditions of the lesion part to be detected in endoscopic examination are exemplified as (a) to (f) below.
The processor 11 executes a predetermined process by executing a program or the like stored in the memory 12. The processor 11 is one or more processors such as a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and a TPU (Tensor Processing Unit). The processor 11 may be configured by a plurality of processors. The processor 11 is an example of a computer.
The memory 12 is configured by a variety of volatile memories used as working memories, such as a RAM (Random Access Memory), and nonvolatile memories which store information necessary for the process to be executed by the image processing device 1, such as a ROM (Read Only Memory). The memory 12 may include an external storage device such as a hard disk connected to or built into the image processing device 1, or may include a storage medium such as a removable flash memory. The memory 12 stores a program for the image processing device 1 to execute each process in the present example embodiment.
Further, the memory 12 functionally includes a first model information storage unit D1 for storing first model information, and a second model information storage unit D2 for storing second model information. The first model information includes parameter information of a first model that is used by the image processing device 1 to detect a lesion part. The first model information may further include information indicating the calculation result of the detection process of the lesion part using the first model. The second model information includes parameter information of a second model that is used by the image processing device 1 to detect a lesion part. The second model information may further include information indicating the calculation result of the detection process of the lesion part using the second model.
The first model is a model to make an inference regarding a lesion part of the examination target based on a fixed predetermined number (may be one or may be multiple) of endoscopic images. Specifically, the first model is a model which learned the relation between a fixed predetermined number of endoscopic images or the feature values thereof, to be inputted to the first model, and a determination result regarding the lesion part in the endoscopic images. In other words, the first model is a model that is trained to output, when input data that is a fixed predetermined number of endoscopic images or their feature values is inputted thereto, a determination result regarding a lesion part in the endoscopic images. In the present example embodiment, the determination result regarding the lesion part outputted from the first model includes at least a score regarding the presence or absence of the lesion part in the endoscopic images, and this score is hereafter also referred to as “first score S1”. For convenience of explanation, the first score S1 shall indicate that the higher the first score S1 is, the higher the degree of confidence that there is a lesion part in the endoscopic images of interest becomes. The above-described determination result regarding the lesion part may further include information indicating the position or region (area) of the lesion part in the endoscopic image.
The first model is, for example, a deep learning model which includes a convolutional neural network in its architecture. Examples of the first model include Fully Convolutional Network, SegNet, U-Net, V-Net, Feature Pyramid Network, Mask R-CNN, and DeepLab. The first model information storage unit D1 stores various parameters required for building the first model, such as a layer structure, a neuron structure of each layer, the number of filters and the size of filters in each layer, and the weight for each element of each filter. The first model is trained in advance on the basis of sets of: endoscopic images that are input data conforming to the input format of the first model, or feature values thereof; and correct answer data indicating a determination result of a correct answer regarding the lesion part in the endoscopic images.
The second model is a model configured to make an inference regarding a lesion part of the examination target based on a variable number of endoscopic images. Specifically, the second model is a model which learned, through machine learning, a relation between a variable number of endoscopic images or their feature values and a determination result regarding the lesion part in the endoscopic images. In other words, the second model is a model that is trained to output, when input data that is a variable number of endoscopic images or their feature values is inputted thereto, the determination result on the lesion part in the endoscopic images. In the present example embodiment, the “determination result regarding the lesion part” includes at least a score regarding the presence or absence of the lesion part in the endoscopic images, and this score is hereinafter also referred to as “second score S2”. For convenience of explanation, the second score S2 shall indicate that the higher the second score S2 is, the higher the degree of confidence that there is a lesion part in the endoscopic images of interest becomes. Examples of the second model include a model based on SPRT described in Patent Literature 2. A specific example of the second model based on SPRT will be described later. Various parameters required for building the second model are stored in the second model information storage unit D2.
Further, in the memory 12, in addition to the first model information and the second model information, various information such as parameters necessary for the lesion detection process is stored. At least a portion of the information stored in the memory 12 may be stored in an external device other than the image processing device 1 instead. In this case, the above-described external device may be one or more server devices capable of performing data communication with the image processing device 1 through a communication network or through direct communication.
The interface 13 performs an interface operation between the image processing device 1 and an external device. For example, the interface 13 supplies the display information “Ib” generated by the processor 11 to the display device 2. Further, the interface 13 supplies the light generated by the light source unit 15 to the endoscope 3. The interface 13 also provides an electrical signal to the processor 11 indicative of the endoscopic image Ia supplied from the endoscope 3. The interface 13 may be a communication interface, such as a network adapter, for wired or wireless communication with the external device, or a hardware interface compliant with a USB (Universal Serial Bus), a SATA (Serial AT Attachment), or the like.
The input unit 14 generates an input signal based on the operation by the examiner. Examples of the input unit 14 include a button, a touch panel, a remote controller, and a voice input device. The light source unit 15 generates light to be supplied to the pointed end unit 38 of the endoscope 3. The light source unit 15 may also incorporate a pump or the like for delivering water and air to be supplied to the endoscope 3. The audio output unit 16 outputs a sound under the control of the processor 11.
Next, an outline of the detection process (lesion detection process) of a lesion part by the image processing device 1 will be described. In summary, the image processing device 1 switches between a lesion detection process based on the first model and a lesion detection process based on the second model, based on the degree of variation between time series endoscopic images Ia. Thus, the image processing device 1 selects a suitable model from the first model and the second model according to the situation, and performs detection of the lesion part with high accuracy.
The endoscopic image acquisition unit 30 acquires the endoscopic image Ia captured by the endoscope 3 through the interface 13 at predetermined intervals according to the frame period of the endoscope 3, and supplies the acquired endoscopic image Ia to the variation detection/selection unit 31 and the display control unit 35. Then, each processing unit provided in the subsequent stage periodically performs the processing described later at the time intervals at which the endoscopic image acquisition unit 30 acquires each endoscopic image. Hereinafter, each time point occurring at intervals of the frame period is also referred to as "processing time".
The variation detection/selection unit 31 calculates a score (also referred to as "variation score") representing the degree of variation between the endoscopic image Ia (also referred to as a "current processing image") at the time index t representing the current processing time and the endoscopic image Ia (also referred to as "past image") acquired at the time (i.e., the time index "t-1") immediately before the current processing time. The variation score increases with increase in the degree of variation between the current processing image and the past image. For example, the variation detection/selection unit 31 calculates the variation score based on any similarity index obtained by comparison between the images; since similarity decreases as the variation increases, the variation score may be, for example, a value obtained by inverting such a similarity index. Examples of the similarity index in this case include the correlation coefficient, the SSIM (Structural Similarity) index, the PSNR (Peak Signal-to-Noise Ratio) index, and the square error between corresponding pixels.
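As a minimal sketch of the variation score computation, the following assumes the correlation coefficient as the similarity index and inverts it so that the score grows with inter-frame variation; the function name and the choice of index are illustrative, not specified by the embodiment:

```python
import numpy as np

def variation_score(current: np.ndarray, past: np.ndarray) -> float:
    """Illustrative variation score: one minus the Pearson correlation
    coefficient between corresponding pixels of two grayscale frames.
    (SSIM, PSNR, or a squared error could be used instead.)"""
    a = current.astype(np.float64).ravel()
    b = past.astype(np.float64).ravel()
    a -= a.mean()
    b -= b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    if denom == 0.0:
        return 0.0  # flat images: treat as no variation
    corr = float((a * b).sum() / denom)
    return 1.0 - corr  # higher value = larger inter-frame variation
```

Identical frames yield a score of 0, while strongly anti-correlated frames approach 2.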
Further, the variation detection/selection unit 31 determines, based on the variation score, which of the first model or the second model is selected as a model (also referred to as a “selection model”) to be used for determining whether or not there is a lesion part.
The lesion detection process based on the first model is advantageous in that the first model is capable of detecting a lesion even under conditions where the second score S2 (i.e., the logarithmic likelihood ratio to be described later) based on the second model does not easily increase when there is no temporal change in the endoscopic images Ia (i.e., when the variation score is relatively low). In contrast, the lesion detection process based on the second model is advantageous in that the second model is robust to instantaneous noises and capable of promptly detecting a lesion part that is easily distinguishable.
In consideration of the above, if the variation score is equal to or less than a predetermined threshold value (also referred to as “variation threshold value”), the variation detection/selection unit 31 determines the selection model to be the first model, and supplies the command to calculate the first score S1 and the endoscopic image Ia to the first score calculation unit 32. Thus, the lesion detection process based on the first model is performed, which leads to accurate lesion detection even when the temporal variation between the endoscopic images Ia is small. On the other hand, if the variation score is larger than the variation threshold value, the variation detection/selection unit 31 determines the selection model to be the second model and supplies the command to calculate the second score S2 and the endoscopic image Ia to the second score calculation unit 33. Thus, the lesion detection process based on the second model is performed, which leads to accurate lesion detection even when the temporal variation between the endoscopic images Ia is large.
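The selection rule above can be sketched as follows; the string labels are illustrative stand-ins for the two score calculation units:

```python
def select_model(variation_score: float, variation_threshold: float) -> str:
    """Selection rule: the first model (fixed number of images) when the
    inter-frame variation is at or below the variation threshold value,
    otherwise the second model (variable number of images)."""
    if variation_score <= variation_threshold:
        return "first"   # small temporal variation: fixed-length model
    return "second"      # large temporal variation: SPRT-based model
```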
The first score calculation unit 32 calculates the first score S1 with reference to the first model information storage unit D1 when receiving the command to calculate the first score S1 from the variation detection/selection unit 31. In this instance, the first score calculation unit 32 acquires the first score S1 outputted by the first model by inputting the endoscopic image Ia to the first model which is configured by referring to the first model information storage unit D1. If the first model is a model configured to output the first score S1 based on a single endoscopic image Ia, the first score calculation unit 32 calculates the first score S1 at the current processing time by inputting the endoscopic image Ia obtained at the current processing time to the first model, for example. In contrast, if the first model is a model configured to output the first score S1 on the basis of a plurality of endoscopic images Ia, the first score calculation unit 32 may calculate the first score S1 at the current processing time by inputting, into the first model, a combination of the endoscopic image Ia obtained at the current processing time and the endoscopic image(s) Ia obtained in the past processing time(s), for example. The first score calculation unit 32 may calculate the first score S1 (i.e., moving-averaged score) obtained by averaging the score(s) obtained at the past processing time(s) and the score obtained at the current processing time. The first score calculation unit 32 supplies the calculated first score S1 to the lesion detection unit 34.
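The optional moving-average step for the first score S1 can be sketched as follows; the window length is an illustrative choice not fixed by the embodiment:

```python
from collections import deque

class FirstScoreSmoother:
    """Moving average of the first score S1 over the most recent raw
    scores, as in the optional averaging described above. The window
    length is an assumed parameter."""
    def __init__(self, window: int = 5):
        self.scores = deque(maxlen=window)  # drops oldest automatically

    def update(self, raw_score: float) -> float:
        self.scores.append(raw_score)
        return sum(self.scores) / len(self.scores)
```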
When receiving the command to calculate the second score S2 from the variation detection/selection unit 31, the second score calculation unit 33 calculates the second score S2 indicating the likelihood of the presence of the lesion part based on information stored in the second model information storage unit D2 and the variable number of time-series endoscopic images Ia obtained up to the present. In this instance, for each processing time, the second score calculation unit 33 determines the second score S2 based on the likelihood ratio regarding the time series endoscopic images Ia, which is calculated using the second model based on SPRT. Here, "likelihood ratio regarding the time series endoscopic images Ia" refers to the ratio of the likelihood that there is a lesion part in the time series endoscopic images Ia to the likelihood that there is no lesion part in the time series endoscopic images Ia. In the present example embodiment, as an example, it is assumed that the likelihood ratio increases with an increase in the likelihood of the presence of the lesion part. Specific examples of the method for calculating the second score S2 using the second model based on SPRT will be described later. The second score calculation unit 33 supplies the calculated second score S2 to the lesion detection unit 34.
At least one of the first model and/or the second model may have an architecture of a feature extractor configured to convert an endoscopic image Ia into feature values (e.g., feature vectors or third- or higher-order tensor data) represented in a predetermined dimensional feature space. In this case, the feature extractor may be a deep learning model with an architecture such as a convolutional neural network. In this case, machine learning is applied to the feature extractor in advance, and the parameters obtained through the machine learning are stored in advance in the memory 12 or the like. The feature extractor may extract feature values representing the relation among time series data, based on any technique for calculating the relation among the time series data such as LSTM (Long Short Term Memory). If only one of the first model or the second model includes the feature extractor, the first score calculation unit 32 and the second score calculation unit 33 may exchange the feature data indicating the feature values outputted by the feature extractor to share the feature data.
Further, the feature extractor may be a model independent of the first model and the second model. In this case, for example, the feature extraction unit which converts the endoscope image Ia to feature values based on a feature extractor is provided between the variation detection/selection unit 31 and the first score calculation unit 32 or the second score calculation unit 33. The feature extraction unit builds the feature extractor based on the parameters previously stored in the memory 12 or the like, and then acquires the feature values outputted by the feature extractor by inputting the endoscopic image Ia into the feature extractor, and supplies the feature data indicating the feature values to either the first score calculation unit 32 or the second score calculation unit 33 according to the selection result of the model by the variation detection/selection unit 31. In this case, the first score calculation unit 32 acquires the first score S1 outputted by the first model by inputting the feature values of the endoscopic image(s) Ia into the first model built by referring to the first model information storage unit D1. The second score calculation unit 33 acquires the second score S2 outputted by the second model by inputting the feature values of the time-series endoscopic images Ia into the second model built by referring to the second model information storage unit D2.
The lesion detection unit 34 performs the lesion detection process (i.e., determination of presence or absence of the lesion) on the endoscopic images Ia, based on the first score S1 supplied from the first score calculation unit 32, or, the second score S2 supplied from the second score calculation unit 33. In this case, the lesion detection unit 34 determines whether or not there is a lesion part, based on the score (the first score S1 or the second score S2) of the selection model (the first model or the second model) selected by the variation detection/selection unit 31. Specific examples of the process executed by the lesion detection unit 34 will be described later. The lesion detection unit 34 supplies the lesion detection result to the display control unit 35.
The display control unit 35 generates display information Ib, based on the endoscopic image Ia and the lesion detection result supplied from the lesion detection unit 34, then supplies the display information Ib to the display device 2 through the interface 13 to thereby cause the display device 2 to display information regarding the endoscopic image Ia and the lesion detection result outputted by the lesion detection unit 34. The display control unit 35 may further display information (including information regarding the score calculated by the selection model) regarding the selection model used in the lesion detection process on the display device 2.
The display control unit 35 herein displays, in the real time image display area 71, a moving image representing the latest endoscopic image Ia. Furthermore, in the lesion detection result display area 72, the display control unit 35 displays the lesion detection result outputted by the lesion detection unit 34. At the time of providing the display screen image shown in
Further, in the processing detail display area 73, the display control unit 35 displays a variation score, a selection model, and a graph indicating the transition of the score that the selection model calculates. Here, as an example, the display control unit 35 expresses the variation score by “high” or “low”, and herein displays the variation score as “low” because the variation score is equal to or less than the variation threshold value. Further, since the variation score is equal to or less than the variation threshold value, and since the lesion detection process based on the first model has been performed, the display control unit 35 displays that the lesion detection process is ongoing based on the first model. Furthermore, the display control unit 35 displays, as a graph indicating the transition of the score of the selection model, a score transition graph indicating the transition of the first score S1 from the start of the endoscopic examination to the present, together with a dashed line indicating a criterion value (first score threshold value Sth1 to be described later) of the first score S1 for determining the presence or absence of a lesion.
Each component of the endoscopic image acquisition unit 30, the variation detection/selection unit 31, the first score calculation unit 32, the second score calculation unit 33, the lesion detection unit 34 and the display control unit 35 can be realized, for example, by the processor 11 which executes a program. In addition, the necessary program may be recorded in any non-volatile storage medium and installed as necessary to realize the respective components. In addition, at least a part of these components is not limited to being realized by a software program and may be realized by any combination of hardware, firmware, and software. At least some of these components may also be implemented using user-programmable integrated circuitry, such as FPGA (Field-Programmable Gate Array) and microcontrollers. In this case, the integrated circuit may be used to realize a program for configuring each of the above-described components. Further, at least a part of the components may be configured by an ASSP (Application Specific Standard Product), an ASIC (Application Specific Integrated Circuit) and/or a quantum processor (quantum computer control chip). In this way, each component may be implemented by a variety of hardware. The above is true for other example embodiments to be described later. Further, each of these components may be realized by the collaboration of a plurality of computers, for example, using cloud computing technology.
Next, an exemplary calculation of the second score S2 using the second model based on SPRT will be described.
The second score calculation unit 33 calculates the likelihood ratio relating to the latest "N" number of endoscopic images Ia (N is an integer of 2 or more) for each processing time, and determines the second score S2 based on the likelihood ratio (also referred to as "integrated likelihood ratio") into which the likelihood ratio calculated at the current processing time and the likelihood ratio(s) calculated at past processing time(s) are integrated. The second score S2 may be the integrated likelihood ratio itself or may be a function value including the integrated likelihood ratio as an argument. Hereinafter, for convenience of explanation, the second model shall include a likelihood ratio calculation model that is a processing unit for calculating the likelihood ratio and a score calculation model that is a processing unit for calculating the second score S2 from the likelihood ratio.
The likelihood ratio calculation model is a model that is trained to output, when feature data of N endoscopic images Ia is inputted thereto, the likelihood ratio regarding the N endoscopic images Ia. The likelihood ratio calculation model may be a deep learning model, a statistical model, or any other machine learning model. In this instance, for example, the learned parameters of the second model including the likelihood ratio calculation model are stored in the second model information storage unit D2. When the likelihood ratio calculation model is constituted by a neural network, various parameters such as a layer structure, a neuron structure of each layer, the number of filters and the filter size in each layer, and the weight for each element of each filter are stored in advance in the second model information storage unit D2. It is noted that even when the number of acquired endoscopic images Ia is less than N, the second score calculation unit 33 can acquire the likelihood ratio from the acquired less than N endoscopic images Ia using the likelihood ratio calculation model. The second score calculation unit 33 may store the acquired likelihood ratio in the second model information storage unit D2.
Next, the score calculation model included in the second model will be described. It is hereinafter assumed that the index "1" denotes a predetermined start time, the index "t" denotes the current processing time, and that "xi" (i = 1, . . . , t) denotes the feature values of the endoscopic image Ia to be processed at the processing time i. The "start time" represents the first processing time of the past processing times to be considered in the calculation of the second score S2. In this instance, the integrated likelihood ratio for the binary classification between the class "C1" indicating that the endoscopic images Ia contain the lesion part and the class "C0" indicating that the endoscopic images Ia do not contain the lesion part is expressed by the following equation (1).
Here, “p” represents the probability (i.e., confidence score ranging from 0 to 1) belonging to each class. In calculating the term on the right-hand side of the equation (1), the likelihood ratio outputted by the likelihood ratio calculation model can be used.
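As a simplified sketch of the accumulation expressed by equation (1), the following assumes that the integrated log-likelihood ratio is the running sum of per-frame log-likelihood ratios output by the likelihood ratio calculation model; the embodiment's actual equation may instead use an N-frame windowed decomposition, so this independence assumption is an illustrative simplification:

```python
def integrated_log_likelihood_ratio(frame_llrs):
    """Running integrated log-likelihood ratio, i.e., an approximation of
    log( p(x_1..x_t | C1) / p(x_1..x_t | C0) ) under a frame-wise
    independence assumption: the cumulative sum of per-frame
    log-likelihood ratios. Returns the value at every processing time."""
    total = 0.0
    history = []
    for llr in frame_llrs:
        total += llr
        history.append(total)
    return history
```

Because the time index t grows with each acquired frame, the returned sequence naturally reflects a variable-length input, matching the variable-number property described below.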
In the equation (1), since the time index t representing the current processing time increases over time, the length (i.e., the number of frames) of the time series endoscopic images Ia used for calculating the integrated likelihood ratio is a variable length. Thus, the first advantage of using the integrated likelihood ratio based on the equation (1) is that the second score calculation unit 33 can calculate the second score S2 considering a variable number of the endoscopic images Ia. The second advantage of using the integrated likelihood ratio based on the equation (1) is that the time-dependent features can be classified. The third advantage thereof is that it is possible to calculate the second score S2 with sufficient accuracy even for data that is difficult to discriminate. The second score calculation unit 33 may store the integrated likelihood ratio and the second score S2 calculated at the respective processing times in the second model information storage unit D2.
The second score calculation unit 33 may determine that there is no lesion part if the second score S2 reaches a predetermined threshold value which is a negative value. In this case, the second score calculation unit 33 initializes the second score S2 and the time index t to 0, and restarts the calculation of the second score S2 on the basis of the endoscopic images Ia obtained at subsequent processing times.
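The reset behavior above can be sketched as follows; the class name and the per-frame increment interface are illustrative assumptions:

```python
class SecondScoreTracker:
    """Tracks the second score S2 and, as described above, resets both
    the score and the time index to 0 once S2 reaches a predetermined
    negative threshold value (interpreted as 'no lesion' for the frames
    accumulated so far)."""
    def __init__(self, negative_threshold: float):
        assert negative_threshold < 0
        self.negative_threshold = negative_threshold
        self.s2 = 0.0
        self.t = 0

    def update(self, llr_increment: float) -> bool:
        """Accumulates one per-frame increment; returns True on reset."""
        self.t += 1
        self.s2 += llr_increment
        if self.s2 <= self.negative_threshold:
            self.s2 = 0.0
            self.t = 0
            return True
        return False
```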
Next, a description will be given of a specific method of determining whether or not there is a lesion part by the lesion detection unit 34. The lesion detection unit 34 determines, based on the score of the selection model determined by the variation detection/selection unit 31, whether or not there is a lesion part.
In the example shown in
On the other hand, in the period from the processing time t10 to the processing time “t20”, since the variation score is larger than the variation threshold value, the variation detection/selection unit 31 determines the selection model to be the second model in the period, and the lesion detection unit 34 determines, based on the second score S2, whether or not there is a lesion part. In the period after the processing time t20, since the variation score is equal to or less than the variation threshold value again, the variation detection/selection unit 31 determines the selection model to be the first model in the period and the lesion detection unit 34 determines, based on the first score S1, whether or not there is a lesion part.
When the selection model is the first model, the lesion detection unit 34 compares the first score S1 with a threshold value (also referred to as "first score threshold value Sth1") for the first score S1 at each processing time. Then, if the first score S1 consecutively exceeds the first score threshold value Sth1 more than a predetermined number of times (also referred to as "threshold count Mth"), the lesion detection unit 34 determines that there is a lesion part.
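The consecutive-count decision for the first model can be sketched as follows; the threshold values passed to the constructor are illustrative:

```python
class ConsecutiveCountDetector:
    """Lesion decision for the first model: declare a lesion once S1 has
    exceeded the first score threshold value Sth1 on more than Mth
    consecutive processing times."""
    def __init__(self, sth1: float, mth: int):
        self.sth1 = sth1
        self.mth = mth
        self.m = 0  # over-threshold consecutive count

    def update(self, s1: float) -> bool:
        if s1 > self.sth1:
            self.m += 1
        else:
            self.m = 0  # a sub-threshold score breaks the streak
        return self.m > self.mth
```

Requiring a consecutive streak rather than a single exceedance is what improves specificity against instantaneous noise such as blurring.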
Hereafter, the number of times the first score S1 has consecutively exceeded the first score threshold value Sth1 is referred to as “over-threshold consecutive count M”. It is noted that fitting values for the first score threshold value Sth1 and the second score threshold value Sth2 are stored in advance in the memory 12 or the like, respectively, for example.
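The consecutive-count rule described above can be sketched as follows (a hypothetical helper; the per-frame scores S1 are assumed to be produced elsewhere by the first model):

```python
def detect_first_model(scores, sth1, mth):
    """Return the index at which a lesion part is declared, or None.
    A lesion is declared once the first score S1 has exceeded the
    threshold Sth1 more than Mth consecutive times."""
    m = 0  # over-threshold consecutive count M
    for i, s1 in enumerate(scores):
        m = m + 1 if s1 > sth1 else 0   # any dip below Sth1 resets M to 0
        if m > mth:
            return i
    return None
```

Note that a single low-scoring frame (e.g., due to blurring) resets M, which is why this rule is sensitive to instantaneous noise, as discussed later.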
In the example shown in
When the selection model is the second model, the lesion detection unit 34 compares the second score S2 obtained at each processing time with a threshold value (also referred to as “second score threshold value Sth2”) for the second score S2. Then, if the second score S2 becomes larger than the second score threshold value Sth2, the lesion detection unit 34 determines that there is a lesion part. On the other hand, if the second score S2 becomes smaller than a predetermined threshold value which is a negative value, the lesion detection unit 34 determines that there is no lesion part. For example, the predetermined threshold value described above is set to the negative value (i.e., −Sth2) whose absolute value is equal to the second score threshold value Sth2. The fitting values of the second score threshold value Sth2 and the predetermined threshold value described above are stored in advance in the memory 12 or the like.
In the example shown in
A supplemental description will be given of the advantages and disadvantages of the lesion detection process by the first model based on a convolutional neural network and the lesion detection process by the second model based on SPRT.
If a model based on a convolutional neural network is used for lesion detection, the presence or absence of a detected lesion is determined by comparing the over-threshold consecutive count M with the threshold count Mth in order to improve the specificity. Such lesion detection has the advantage that it is possible to detect a lesion even under circumstances in which the logarithmic likelihood ratio to be calculated by the second model based on SPRT does not easily increase, e.g., when there is no substantial variation in the endoscopic images Ia with time. On the other hand, this lesion detection is more susceptible to noises (including blurring) than the lesion detection process based on the second model, and it could require a large number of endoscopic images Ia to detect a lesion part even when the lesion part is easily distinguishable. In contrast, the second model based on SPRT is robust to instantaneous noises and can promptly detect a lesion part that is easily distinguishable. Unfortunately, in such a case that there is little temporal variation in the endoscopic images Ia, the logarithmic likelihood ratio becomes hard to increase, and therefore the number of endoscopic images Ia required to detect a lesion could increase. In view of the above, in the present example embodiment, the image processing device 1 adaptively switches, based on the variation score, the selection model to be used for the lesion detection process between the first model and the second model. Thus, it is possible to set the selection model to the first model when the lesion detection process based on the first model is effective, while setting the selection model to the second model when the lesion detection process based on the second model is effective. It is therefore possible to suitably increase the lesion detection accuracy.
First, the endoscopic image acquisition unit 30 of the image processing device 1 acquires the endoscopic image Ia (step S11). In this instance, the endoscopic image acquisition unit 30 of the image processing device 1 receives the endoscopic image Ia from the endoscope 3 through the interface 13. The display control unit 35 executes a process of displaying the endoscopic image Ia acquired at step S11 on the display device 2.
Next, the variation detection/selection unit 31 of the image processing device 1 calculates a variation score based on the current processing image, which is the endoscopic image Ia obtained at step S11 at the current processing time, and the past image, which is the endoscopic image Ia obtained at step S11 at the immediately preceding processing time. Then, the variation detection/selection unit 31 determines whether or not the variation score is larger than the variation threshold value (step S13).
If the variation detection/selection unit 31 determines that the variation score is larger than the variation threshold value (step S13; Yes), the image processing device 1 performs a lesion detection process based on the second model (step S14). In this case, the second score calculation unit 33 calculates the second score S2 based on the second model, and the lesion detection unit 34 determines the presence or absence of a lesion part based on the second score S2 and the second score threshold value Sth2.
On the other hand, if the variation detection/selection unit 31 determines that the variation score is equal to or less than the variation threshold value (step S13; No), the image processing device 1 performs the lesion detection process based on the first model (step S15). In this case, the first score calculation unit 32 calculates the first score S1 based on the first model, and then the lesion detection unit 34 determines, based on a comparison result between the over-threshold consecutive count M and the threshold count Mth, whether or not there is a lesion part, wherein the over-threshold consecutive count M is the number of consecutive times that the first score S1 has exceeded the first score threshold value Sth1.
Then, the image processing device 1 determines whether or not the endoscopic examination is completed (step S16). For example, the image processing device 1 determines that the endoscopic examination has been completed when a predetermined input or the like to the input unit 14 or the operation unit 36 is detected. When it is determined that the endoscopic examination has been completed (step S16; Yes), the image processing device 1 ends the process of the flowchart. On the other hand, when it is determined that the endoscopic examination has not been completed (step S16; No), the image processing device 1 gets back to the process at step S11.
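The loop of steps S11 through S16 can be sketched as follows, with the variation-score function and the two detection branches passed in as placeholder callables (the names below are illustrative, not the units defined in the text):

```python
def process_stream(frames, variation_score, first_model_step,
                   second_model_step, variation_threshold):
    """Skeleton of the per-frame loop: the variation score between the
    current frame and the immediately preceding frame selects which
    detection branch runs at each processing time."""
    prev = None
    decisions = []
    for frame in frames:
        if prev is not None:
            if variation_score(frame, prev) > variation_threshold:
                decisions.append(second_model_step(frame))   # step S14
            else:
                decisions.append(first_model_step(frame))    # step S15
        prev = frame  # current frame becomes the past image
    return decisions
```

In an actual implementation each branch would carry state (the count M or the accumulated S2) across iterations; the sketch only shows the switching structure.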
Next, a description will be given of modifications of the first example embodiment described above. The following modifications may be arbitrarily combined.
Instead of calculating the variation score by directly comparing the current processing image with the past image, the variation detection/selection unit 31 may calculate the degree of similarity between the feature values of the current processing image and the feature values of the past image as the variation score. Thus, the variation detection/selection unit 31 can also calculate a variation score indicating the substantial degree of similarity between the current processing image and the past image and suitably determine the presence or absence of the variation.
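As one illustrative choice of such a similarity-based variation score (the text does not prescribe a particular metric), one minus the cosine similarity of the two feature vectors could be used, so that similar frames yield a score near 0 and dissimilar frames a larger score:

```python
import math

def cosine_variation(feat_cur, feat_past):
    """Illustrative variation score: 1 - cosine similarity between the
    feature vectors of the current processing image and the past image."""
    dot = sum(a * b for a, b in zip(feat_cur, feat_past))
    na = math.sqrt(sum(a * a for a in feat_cur))
    nb = math.sqrt(sum(b * b for b in feat_past))
    if na == 0 or nb == 0:
        return 1.0  # treat a zero vector as maximally dissimilar
    return 1.0 - dot / (na * nb)
```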
After the examination, the image processing device 1 may process a moving image configured by endoscopic images Ia generated during the endoscopic examination.
For example, if a moving image to be processed is designated based on the user input by the input unit 14 at any timing after the examination, the image processing device 1 repeatedly performs the process of the flowchart shown in
In the second example embodiment, in the lesion detection process based on the first model, the image processing device 1 changes the threshold count Mth to be used for the lesion detection process, based on the second score S2 outputted by the second model. Here, the threshold count Mth is a parameter that defines a condition for determining that a lesion is detected on the basis of the first score S1. The image processing device 1 changes the threshold count Mth so that, the higher the degree of confidence of the presence of a lesion indicated by the second score S2 is, the more the above-described condition is relaxed. Similarly, in the lesion detection process based on the second model, the image processing device 1 changes the second score threshold value Sth2 to be used for the lesion detection process, based on the first score S1 outputted by the first model. Here, the second score threshold value Sth2 is a parameter that defines a condition for determining that a lesion is detected on the basis of the second score S2. The image processing device 1 changes the second score threshold value Sth2 so that the higher the degree of confidence of the presence of a lesion indicated by the first score S1 is, the more the above-described condition is relaxed.
Hereinafter, substantially the same components of the endoscopic examination system 100 as in the first example embodiment will be denoted by the same reference numerals as appropriate and a description thereof will be omitted. The hardware configuration of the image processing device 1 according to the second example embodiment is substantially the same as the hardware configuration of the image processing device 1 shown in
First, the case of executing the lesion detection process based on the first model will be specifically described.
The first score calculation unit 32 and the second score calculation unit 33 perform the processing for calculating the first score S1 and the second score S2, respectively, at each processing time, regardless of the selection model determined by the variation detection/selection unit 31, and supply the calculation results to the lesion detection unit 34. Then, if the number of times the first score S1 exceeds the first score threshold value Sth1 in succession becomes larger than the threshold count Mth, the lesion detection unit 34 determines that there is a lesion part. On the other hand, if the second score S2 becomes larger than the second score threshold value Sth2, the lesion detection unit 34 reduces the threshold count Mth. In this way, the lesion detection unit 34 relaxes the condition for determining, based on the first score S1, that there is a lesion part in such a situation that the presence of the lesion part is suspected based on the second score S2 outputted by the second model. Thus, the lesion detection unit 34 can determine the presence or absence of the lesion part more accurately in the lesion detection process based on the first model.
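One processing time of this branch can be sketched as follows; the initial and relaxed values of Mth, and the thresholds, are hypothetical parameters:

```python
def first_model_step(s1, s2, m_prev, sth1, sth2, mth_initial, mth_relaxed):
    """One processing time of the first-model branch in the second
    example embodiment: S1 advances the over-threshold consecutive
    count M, while S2 selects the initial or the relaxed (smaller)
    threshold count Mth.  Returns the new count and the decision."""
    m = m_prev + 1 if s1 > sth1 else 0            # advance or reset M
    mth = mth_relaxed if s2 > sth2 else mth_initial  # relax when S2 is high
    return m, m > mth                              # (new M, lesion decision)
```

With the relaxed count in effect, fewer consecutive over-threshold frames suffice, matching the behavior at the processing times t43 to t45 in the second specific example.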
A description will be given of specific examples (first specific example and second specific example) of the lesion detection process based on the first model according to the second example embodiment with reference to
In this instance, at each processing time after the processing time t30, the lesion detection unit 34 compares the first score S1 obtained at each processing time with the first score threshold value Sth1, and compares the second score S2 obtained at each processing time with the second score threshold value Sth2. Then, at the processing time “t31”, the lesion detection unit 34 determines that the first score S1 exceeds the first score threshold value Sth1, and therefore starts counting the over-threshold consecutive count M. At the processing time “t32”, the lesion detection unit 34 determines that the over-threshold consecutive count M has exceeded the threshold count Mth. Therefore, in this instance, the lesion detection unit 34 determines that there is a lesion part in the endoscopic images Ia obtained from the processing time t31 to t32. On the other hand, after the processing time t30, the lesion detection unit 34 determines that the second score S2 is equal to or less than the second score threshold value Sth2, and therefore keeps the threshold count Mth fixed even after the processing time t30.
In the second specific example, at each processing time after the processing time t40, the lesion detection unit 34 compares the first score S1 obtained at each processing time with the first score threshold value Sth1, and compares the second score S2 obtained at each processing time with the second score threshold value Sth2. Then, during the period from the processing time “t41” to the processing time “t42”, since the first score S1 exceeds the first score threshold value Sth1, the over-threshold consecutive count M increases. On the other hand, the first score S1 becomes equal to or less than the first score threshold value Sth1 after the processing time t42, without the over-threshold consecutive count M reaching the threshold count Mth that is set to the initial value. Thus, the lesion detection unit 34 determines that there is no lesion part in the above-mentioned period.
On the other hand, at the processing time “t43”, the lesion detection unit 34 determines that the second score S2 is larger than the second score threshold value Sth2, and therefore sets the threshold count Mth to a predetermined relaxation value (i.e., a value obtained by relaxing the condition for determining that there is a lesion part from the initial value) that is smaller than the initial value. The relaxation value and the initial value of the threshold count Mth are stored in advance in the memory 12 or the like, for example.
Thereafter, after the processing time “t44”, the over-threshold consecutive count M has increased since the first score S1 exceeds the first score threshold value Sth1. Then, the first score S1 exceeds the first score threshold value Sth1 from the processing time t44 to the processing time “t45” and therefore the over-threshold consecutive count M exceeds the relaxation value of the threshold count Mth. Thus, the lesion detection unit 34 determines that there is a lesion part in the period from the processing time t44 to the processing time t45.
In this way, if the second score S2 based on the second model reaches the second score threshold value Sth2, the lesion detection unit 34 suitably relaxes the condition for determining that a lesion part is present, which suitably leads to improvement of the accuracy of the lesion detection process based on the first model. In addition, when there is a lesion which is easy to discriminate, the above-mentioned relaxation of the condition leads to rapid detection of the lesion with a smaller number of endoscopic images Ia. In this case, reducing the number of endoscopic images Ia required for detection of a lesion part leads to a decrease in the possibility of initialization of the over-threshold consecutive count M due to instantaneous noises.
Next, a case of executing a lesion detection process based on the second model will be specifically described.
The first score calculation unit 32 and the second score calculation unit 33 perform the processing for calculating the first score S1 and the second score S2, respectively, at each processing time, regardless of the selection model determined by the variation detection/selection unit 31, and supply the calculation results to the lesion detection unit 34. Then, at each processing time, the lesion detection unit 34 compares the first score S1 with the first score threshold value Sth1 and compares the second score S2 with the second score threshold value Sth2. Then, if the second score S2 becomes larger than the second score threshold value Sth2, the lesion detection unit 34 determines that there is a lesion part. On the other hand, in a period in which the over-threshold consecutive count M has increased, the lesion detection unit 34 decreases the second score threshold value Sth2 (i.e., relaxes the condition for determining that a lesion part is detected) in stages or continuously with increase in the over-threshold consecutive count M. Thereby, in the lesion detection process based on the second model, the lesion detection unit 34 relaxes the condition for detecting the lesion based on the second model in accordance with the situation. It is therefore possible to more accurately determine the presence or absence of a lesion part.
In this instance, at each processing time after the processing time t50, the lesion detection unit 34 compares the first score S1 obtained at each processing time with the first score threshold value Sth1, and compares the second score S2 obtained at each processing time with the second score threshold value Sth2. Then, the lesion detection unit 34 determines at the processing time “t51” that the first score S1 exceeds the first score threshold value Sth1 and therefore increases the over-threshold consecutive count M.
Then, after the processing time t51 that is the starting time of the period in which the over-threshold consecutive count M has increased, the lesion detection unit 34 changes the second score threshold value Sth2 in accordance with the over-threshold consecutive count M. Here, the lesion detection unit 34 continuously decreases the second score threshold value Sth2 with increase in the over-threshold consecutive count M. Then, at the processing time “t52” included in the period in which the over-threshold consecutive count M has increased, the second score S2 becomes larger than the second score threshold value Sth2. Thus, the lesion detection unit 34 determines, at the processing time t52, that there is a lesion part.
In this way, the lesion detection unit 34 decreases the second score threshold value Sth2 with increase in the over-threshold consecutive count M. Thereby, it is possible to suitably relax the condition for detecting a lesion regarding the second score S2 based on the second model and accurately perform the lesion detection.
The image processing device 1 according to the second example embodiment executes the process of the flowchart shown in
In
After executing the process at step S20, the lesion detection unit 34 determines whether or not the second score S2 is larger than the second score threshold value Sth2 (step S21). Then, if the second score S2 is larger than the second score threshold value Sth2 (step S21; Yes), the lesion detection unit 34 sets the threshold count Mth to the relaxation value smaller than the initial value (step S22). On the other hand, if the second score S2 is equal to or less than the second score threshold value Sth2 (step S21; No), the lesion detection unit 34 sets the threshold count Mth to the initial value (step S23).
Further, after executing the process at step S24, the lesion detection unit 34 determines whether or not the first score S1 is larger than the first score threshold value Sth1 (step S25). Then, if the first score S1 is larger than the first score threshold value Sth1 (step S25; Yes), the lesion detection unit 34 increases the over-threshold consecutive count M by 1 (step S26). It is herein assumed that the initial value of the over-threshold consecutive count M is set to 0. On the other hand, if the first score S1 is equal to or less than the first score threshold value Sth1 (step S25; No), the lesion detection unit 34 sets the over-threshold consecutive count M to 0 which is the initial value (step S27). Then, the process of the flowchart ends.
Next, after executing the process at step S22 or step S23 and the process at step S26, the lesion detection unit 34 determines whether or not the over-threshold consecutive count M is larger than the threshold count Mth (step S28). Then, if the over-threshold consecutive count M is larger than the threshold count Mth (step S28; Yes), the lesion detection unit 34 determines that there is a lesion part, and therefore outputs a notification indicating that a lesion part is detected by at least one of display and sound output (step S29). On the other hand, if the over-threshold consecutive count M is equal to or less than the threshold count Mth (step S28; No), the process of the flowchart ends.
First, the second score calculation unit 33 calculates the second score S2 based on the variable number of the endoscopic images Ia (step S31). In this case, for example, the second score calculation unit 33 calculates the second score S2 based on the variable number of endoscopic images Ia acquired at the current processing time and the previous processing time(s) or their feature data and the second model configured with reference to the second model information storage unit D2. In addition, the first score calculation unit 32 calculates the first score S1 based on a predetermined number of endoscopic images Ia in parallel with the process at step S31 (step S32). In this case, for example, the first score calculation unit 32 calculates the first score S1 based on the fixed predetermined number of endoscopic images Ia obtained at the current processing time (and the past processing times) or their feature data and the first model configured with reference to the first model information storage unit D1.
After executing the process at step S32, the lesion detection unit 34 determines whether or not the first score S1 is larger than the first score threshold value Sth1 (step S33). Then, if the first score S1 is larger than the first score threshold value Sth1 (step S33; Yes), the lesion detection unit 34 increases the over-threshold consecutive count M by 1 (step S34). It is herein assumed that the initial value of the over-threshold consecutive count M is set to 0. On the other hand, if the first score S1 is equal to or less than the first score threshold value Sth1 (step S33; No), the lesion detection unit 34 sets the over-threshold consecutive count M to 0 which is the initial value (step S35).
Then, after executing the process at step S34 or step S35, the lesion detection unit 34 determines, based on the over-threshold consecutive count M, the second score threshold value Sth2 that is a threshold value to be compared with the second score S2 (step S36). In this instance, for example, the lesion detection unit 34 refers to a previously stored expression or look-up table or the like, and decreases the second score threshold value Sth2 with increase in the over-threshold consecutive count M.
Then, after executing the process at step S31 and the process at step S36, the lesion detection unit 34 determines whether or not the second score S2 is larger than the second score threshold value Sth2 (step S37). Then, if the second score S2 is larger than the second score threshold value Sth2 (step S37; Yes), the lesion detection unit 34 determines that there is a lesion part, and therefore outputs a notification indicating that a lesion part is detected by at least one of display and sound output (step S38). On the other hand, if the second score S2 is equal to or less than the second score threshold value Sth2 (step S37; No), the process returns to step S31.
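The flow of steps S31 through S38 can be sketched as a single loop over per-time score pairs; the linear schedule that lowers Sth2 with the count M is an illustrative assumption, since the text allows any stepwise or continuous decrease:

```python
def second_model_loop(pairs, sth1, sth2_base, sth2_min, decay=0.2):
    """Steps S31-S38 in sequence: at each processing time, S1 updates the
    over-threshold consecutive count M (steps S33-S35), M lowers the
    threshold compared against S2 (step S36), and the loop ends when S2
    clears the lowered threshold (steps S37-S38)."""
    m = 0
    for t, (s1, s2) in enumerate(pairs):
        m = m + 1 if s1 > sth1 else 0                 # steps S33-S35
        sth2 = max(sth2_min, sth2_base - decay * m)   # step S36 (illustrative)
        if s2 > sth2:                                 # step S37
            return t                                  # lesion part detected
    return None
```

The floor sth2_min keeps the relaxed threshold from collapsing to zero; this safeguard is an assumption of the sketch, not a requirement stated in the text.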
Next, a description will be given of modifications of the second example embodiment described above. The following modifications may be arbitrarily combined.
In the lesion detection process based on the first model, the lesion detection unit 34 switches the threshold count Mth from the initial value to the relaxation value when the second score S2 exceeds the second score threshold value Sth2. Instead of this mode, the lesion detection unit 34 may decrease, continuously or in stages, the threshold count Mth (i.e., may relax the condition for determining that there is a lesion part) with increase in the second score S2.
In this case, for example, correspondence information such as an expression or a look-up table indicating the relation between each possible second score S2 and the threshold count Mth suitable for that second score S2 is stored in advance in the memory 12 or the like. In the lesion detection process based on the first model, the lesion detection unit 34 determines the threshold count Mth based on the second score S2 and the above-described correspondence information. Even according to this mode, the lesion detection unit 34 sets the threshold count Mth in accordance with the second score S2, which enables the lesion detection unit 34 to accurately detect a lesion.
In the lesion detection process based on the first model, instead of changing the threshold count Mth based on the second score S2, or, in addition to this, the lesion detection unit 34 may change the first score threshold value Sth1 based on the second score S2. In this instance, for example, the lesion detection unit 34 may decrease the first score threshold value Sth1 in stages or continuously with increase in the second score S2. According to this mode, the lesion detection unit 34 can suitably relax the condition for detecting a lesion based on the first model and accurately perform the lesion detection.
If it is determined, in the lesion detection process based on the first model, that a predetermined condition based on the first score S1 is satisfied, the image processing device 1 may start the process of calculating the second score S2 and changing the threshold count Mth.
For example, the image processing device 1 does not perform the calculation of the second score S2 by the second score calculation unit 33 after the start of the lesion detection process based on the first model. When it is determined that the first score S1 exceeds the first score threshold value Sth1, the image processing device 1 starts the calculation of the second score S2 by the second score calculation unit 33, and changes the threshold count Mth (or the first score threshold value Sth1) in accordance with the second score S2 in the same manner as in the above-described example embodiment. On the other hand, after the calculation of the second score S2 by the second score calculation unit 33 is started, the image processing device 1 stops the calculation of the second score S2 by the second score calculation unit 33 again when it is determined that the first score S1 is equal to or less than the first score threshold value Sth1. The above-mentioned “predetermined condition” is not limited to the condition that the first score S1 is larger than the first score threshold value Sth1, and it may be any condition under which the probability that there is a lesion part is determined to be sufficiently high. Examples of such conditions include the condition that the first score S1 is larger than a predetermined threshold value that is smaller than the first score threshold value Sth1, the condition that the increment per unit time of the first score S1 (i.e., the derivative of the first score S1) is equal to or larger than a predetermined value, and the condition that the over-threshold consecutive count M is equal to or larger than a predetermined value.
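The example conditions listed above can be sketched as a single gating predicate that decides whether to wake up the second-score calculation; the margin, slope, and count values below are purely illustrative:

```python
def should_run_second_model(s1, sth1, ds1_dt, m,
                            s1_margin=0.8, slope_min=0.1, m_min=2):
    """Return True when any of the example 'predetermined conditions'
    holds: S1 exceeds Sth1, S1 exceeds a lower threshold (here a fixed
    fraction of Sth1), the per-unit-time increment of S1 is large
    enough, or the consecutive count M is large enough."""
    return (s1 > sth1
            or s1 > s1_margin * sth1     # lower threshold, assumed 0.8*Sth1
            or ds1_dt >= slope_min       # derivative condition
            or m >= m_min)               # consecutive-count condition
```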
Further, if the predetermined condition is satisfied and the calculation of the second score S2 is started, the image processing device 1 may retroactively calculate the second score S2 for the past processing time(s) and change the threshold count Mth (or the first score threshold value Sth1) based on the second score S2 at the past processing time. In this instance, for example, the image processing device 1 stores the endoscopic image Ia or its feature data obtained at the past processing time in the memory 12 or the like, the second score calculation unit 33 calculates the second score S2 at the past processing time based on the endoscopic image Ia or its feature data, and the image processing device 1 changes the threshold count Mth (or the first score threshold value Sth1) based on the second score S2.
According to this modification, the image processing device 1 limits the time period for calculating the second score S2 in the lesion detection process based on the first model and can suitably reduce the computational burden.
In the lesion detection process based on the second model, the image processing device 1 may start the process of calculating the first score S1 and changing the second score threshold value Sth2 by the first model when it is determined that a predetermined condition based on the second score S2 is satisfied.
For example, after the start of the lesion detection process based on the second model, the image processing device 1 does not perform the calculation of the first score S1 by the first score calculation unit 32, and starts the calculation of the first score S1 by the first score calculation unit 32 if the second score S2 is larger than a predetermined threshold value (e.g., 0) smaller than the second score threshold value Sth2. Then, the image processing device 1 changes the second score threshold value Sth2 in accordance with the over-threshold consecutive count M in the same manner as in the above-described example embodiment. On the other hand, after the calculation of the first score S1 by the first score calculation unit 32 is started, the image processing device 1 stops the calculation of the first score S1 by the first score calculation unit 32 again if it is determined that the second score S2 is equal to or less than the predetermined threshold value. It is noted that the above-mentioned “predetermined condition” is not limited to the condition that the second score S2 is larger than the predetermined threshold value, and it may be any condition under which the probability that there is a lesion part is determined to be sufficiently high. Examples of such conditions include the condition that the increment per unit time of the second score S2 (i.e., the derivative of the second score S2) is equal to or larger than a predetermined value.
In addition, if the predetermined condition is satisfied and therefore the calculation of the first score S1 is started, the image processing device 1 may retroactively calculate the first score S1 for the past processing time(s) and change the second score threshold value Sth2 based on the first score S1 at the past processing time. In this instance, for example, the image processing device 1 stores the endoscopic images Ia or their feature data obtained at the past processing times in the memory 12 or the like, the first score calculation unit 32 calculates the first score S1 at the past processing time based on the endoscopic images Ia or their feature data, and the image processing device 1 changes the second score threshold value Sth2 at the past processing time based on the first score S1. In this instance, the image processing device 1 compares the second score S2 with the second score threshold value Sth2 at every past processing time to determine whether or not there is a lesion part.
According to this modification, the image processing device 1 limits the time period for calculating the first score S1 in the lesion detection process based on the second model, and can suitably reduce the computational burden.
After the examination, the image processing device 1 may process a moving image configured by endoscopic images Ia generated during the endoscopic examination.
The acquisition means 30X is configured to acquire an endoscopic image obtained by photographing an examination target by a photographing unit provided in an endoscope. In this instance, the acquisition means 30X may immediately acquire the endoscopic image generated by the photographing unit, or may acquire at a predetermined timing the endoscopic image previously generated by the photographing unit and stored in the storage device. Examples of the acquisition means 30X include the endoscopic image acquisition unit 30 in the first example embodiment and the second example embodiment.
The variation detection means 311X is configured to detect a degree of variation between the endoscopic images. The selection means 312X is configured to select either one of a first model or a second model based on the degree of variation, the first model being configured to make an inference regarding a lesion of the examination target based on a predetermined number of the endoscopic images, the second model being configured to make an inference regarding the lesion based on a variable number of the endoscopic images. Examples of the variation detection means 311X and the selection means 312X include the variation detection/selection unit 31 in the first example embodiment and the second example embodiment.
The lesion detection means 34X is configured to detect the lesion based on a selection model that is either the first model or the second model selected. Examples of the lesion detection means 34X include the first score calculation unit 32, the second score calculation unit 33, and the lesion detection unit 34 in the first example embodiment and the second example embodiment.
According to the third example embodiment, the image processing device 1X can accurately detect a lesion part included in an endoscopic image.
In the example embodiments described above, the program can be stored in any type of non-transitory computer-readable medium and supplied to a control unit or the like that is a computer. Non-transitory computer-readable media include any type of tangible storage medium. Examples of the non-transitory computer-readable medium include a magnetic storage medium (e.g., a flexible disk, a magnetic tape, a hard disk drive), a magneto-optical storage medium (e.g., a magneto-optical disk), a CD-ROM (Read Only Memory), a CD-R, a CD-R/W, and a solid-state memory (e.g., a mask ROM, a PROM (Programmable ROM), an EPROM (Erasable PROM), a flash ROM, and a RAM (Random Access Memory)). The program may also be supplied to the computer by any type of transitory computer-readable medium. Examples of the transitory computer-readable medium include an electrical signal, an optical signal, and an electromagnetic wave. The transitory computer-readable medium can supply the program to the computer through a wired channel such as electric wires and optical fibers, or through a wireless channel.
The whole or a part of the example embodiments described above (including modifications, the same applies hereinafter) can be described as, but not limited to, the following Supplementary Notes.
An image processing device comprising:
The image processing device according to Supplementary Note 1,
The image processing device according to Supplementary Note 1,
The image processing device according to Supplementary Note 1,
The image processing device according to Supplementary Note 1,
The image processing device according to Supplementary Note 1,
The image processing device according to Supplementary Note 6,
The image processing device according to Supplementary Note 6 or 7,
The image processing device according to Supplementary Note 1, further comprising
The image processing device according to Supplementary Note 9,
An image processing method is an image processing method executed by a computer, the image processing method comprising:
A storage medium storing a program executed by a computer, the program causing the computer to:
While the invention has been particularly shown and described with reference to example embodiments thereof, the invention is not limited to these example embodiments. It will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present invention as defined by the claims. In other words, it is needless to say that the present invention includes various modifications that could be made by a person skilled in the art according to the entire disclosure, including the scope of the claims and the technical philosophy. All Patent and Non-Patent Literatures mentioned in this specification are incorporated by reference in their entirety.
Number | Date | Country | Kind |
---|---|---|---|
PCT/JP2022/037420 | Oct 2022 | WO | international |
This application is a Continuation of U.S. application Ser. No. 18/560,974 filed Nov. 15, 2023, which is a National Stage of International Application No. PCT/JP2023/029834 filed Aug. 18, 2023, claiming priority based on International Application No. PCT/JP2022/037420 filed Oct. 6, 2022, the contents of all of which are incorporated herein by reference, in their entirety.
Number | Date | Country | |
---|---|---|---|
Parent | 18560974 | Jan 0001 | US |
Child | 18544757 | US |