The present disclosure relates to an image processing apparatus and, more particularly, to an image processing apparatus that executes image processing on a microscopic image of a pathologic specimen, and relates to an image processing method and a computer-readable recording medium.
In the related art, for the diagnosis of a living tissue specimen including a pathologic specimen, a block specimen obtained by organ harvesting or a specimen obtained by needle biopsy is sliced to a thickness of approximately several micrometers, and the observation image obtained by enlarging the sliced specimen with a microscope is observed. Transmission observation using an optical microscope is one of the most traditional and popular observation techniques because the device is inexpensive and easy to handle. In recent years, diagnosis has also been conducted by using an image obtained by capturing the observation image with an imaging device attached to the optical microscope.
A sliced living tissue specimen (hereinafter referred to as “sliced specimen”) hardly absorbs or scatters light and is almost colorless and transparent. Therefore, typically, a sliced specimen is stained prior to microscopy.
Various staining techniques are disclosed, and there are 100 or more types in total. Among them, hematoxylin-eosin stain (hereinafter referred to as "HE stain"), which uses two dyes, blue-violet hematoxylin (hereinafter simply referred to as "H") and red eosin (hereinafter simply referred to as "E"), is normally used for pathologic specimens in particular.
In clinical practice, when it is difficult to visually recognize a target living tissue with HE stain alone, or when the morphological diagnosis of a living tissue needs to be supplemented, a technique is sometimes used in which a special stain different from the HE stain is applied to a specimen to change the color of the target tissue so as to visually highlight it. Further, in histopathological diagnosis, immunostaining (immunohistochemistry: IHC) using various marker proteins for visualizing, for example, the antigen-antibody reaction of a cancer tissue is sometimes used.
Observation of a stained specimen is executed not only by direct visual recognition but also by displaying, on a display device, an image generated by capturing the stained specimen with an imaging device. In recent years, attempts have been proposed to execute image processing on a stained specimen image generated by an imaging device and to conduct analysis so as to support observation and diagnosis by a doctor, etc. For this analysis, there is a technique using learning such as deep learning. In this case, a parameter is obtained by learning pairs of an input image and the analysis value corresponding to the RGB values of the input image.
However, in the observation of a stained specimen, even tissues in the same condition may have different color shades because of a difference in color due to the capturing state of the stained specimen or a difference in color due to the staining process, for example, a difference in the spectrum of a dye or a difference in the staining time. In deep learning or the like, when the color shade of an input image is different from the color shade of the learning images, the inference accuracy is degraded. It is possible to cope with this by covering a larger number of color shades in the learning images; however, this is not practical, as an enormous number of images captured under various conditions would be required. Therefore, there is a known technique for performing color equalization to correct different color shades to an identical color shade (see Japanese Patent No. 5137481).
In some embodiments, an image processing apparatus includes a processor comprising hardware. The processor is configured to: calculate a hue of each pixel of a stained image that is input from outside; execute classification on each pixel of the stained image based on the hue; modulate a color tone of the pixel of the stained image in each class having undergone the classification; combine a plurality of input images to generate a combined image; calculate a color distribution of each pixel of the combined image; execute classification on each pixel of the combined image by using the color distribution; and calculate an average hue of each class having undergone the classification on each pixel of the combined image based on the color distribution and a classification result of classification of the combined image so as to calculate a standard hue.
In some embodiments, an image processing apparatus includes a processor comprising hardware. The processor is configured to: calculate a hue of each pixel of a stained image that is input from outside; execute classification on each pixel of the stained image based on the hue; modulate a color tone of the pixel of the stained image in each class having undergone the classification; rotate a hue of an input image associated with a correct value at different rotation angles to generate a plurality of images having different hues; generate, based on the plurality of images and on a learning result stored in a storage, a plurality of output images; combine hue ranges of images whose error between the plurality of output images and the correct value falls within an allowable range to calculate a color distribution; and calculate an average hue of each class having undergone classification on each pixel of the input image by using the color distribution so as to calculate a standard hue.
In some embodiments, provided is an image processing method implemented by an image processing apparatus. The image processing method includes: calculating a hue of each pixel of a stained image that is input from outside; executing classification on each pixel of the stained image based on the hue; modulating a color tone of the pixel of the stained image in each class having undergone the classification; combining a plurality of input images to generate a combined image; calculating a color distribution of each pixel of the combined image; executing classification on each pixel of the combined image by using the color distribution; and calculating an average hue of each class having undergone the classification on each pixel of the combined image based on the color distribution and a classification result of classification of the combined image so as to calculate a standard hue.
In some embodiments, provided is a non-transitory computer-readable recording medium with an executable program stored thereon. The program causes an image processing apparatus to execute: calculating a hue of each pixel of a stained image that is input from outside; executing classification on each pixel of the stained image based on the hue; modulating a color tone of the pixel of the stained image in each class having undergone the classification; combining a plurality of input images to generate a combined image; calculating a color distribution of each pixel of the combined image; executing classification on each pixel of the combined image by using the color distribution; and calculating an average hue of each class having undergone the classification on each pixel of the combined image based on the color distribution and a classification result of classification of the combined image so as to calculate a standard hue.
In some embodiments, provided is an image processing method implemented by an image processing apparatus. The image processing method includes: calculating a hue of each pixel of a stained image that is input from outside; executing classification on each pixel of the stained image based on the hue; modulating a color tone of the pixel of the stained image in each class having undergone the classification; rotating a hue of an input image associated with a correct value at different rotation angles to generate a plurality of images having different hues; generating, based on the plurality of images and on a learning result stored in a storage, a plurality of output images; combining hue ranges of images whose error between the plurality of output images and the correct value falls within an allowable range to calculate a color distribution; and calculating an average hue of each class having undergone classification on each pixel of the input image by using the color distribution so as to calculate a standard hue.
In some embodiments, provided is a non-transitory computer-readable recording medium with an executable program stored thereon. The program causes an image processing apparatus to execute: calculating a hue of each pixel of a stained image that is input from outside; executing classification on each pixel of the stained image based on the hue; modulating a color tone of the pixel of the stained image in each class having undergone the classification; rotating a hue of an input image associated with a correct value at different rotation angles to generate a plurality of images having different hues; generating, based on the plurality of images and on a learning result stored in a storage, a plurality of output images; combining hue ranges of images whose error between the plurality of output images and the correct value falls within an allowable range to calculate a color distribution; and calculating an average hue of each class having undergone classification on each pixel of the input image by using the color distribution so as to calculate a standard hue.
The above and other features, advantages and technical and industrial significance of this disclosure will be better understood by reading the following detailed description of presently preferred embodiments of the disclosure, when considered in connection with the accompanying drawings.
An image processing apparatus, an image processing method, and a program according to embodiments of the present disclosure are described below with reference to the drawings. The present disclosure is not limited to the embodiments. In the descriptions of the drawings, the same parts are denoted by the same reference numeral.
Configuration of Image Processing Apparatus
Hereinafter, a stained image is an image obtained by capturing a specimen that is stained by using, for example, HE stain, Masson's trichrome stain, Papanicolaou stain, or immunostaining. HE stain is used for typical tissue morphological observation and stains a nucleus in blue violet (hematoxylin) and cytoplasm in pink (eosin). Masson's trichrome stain stains a collagen fiber in blue (aniline blue), a nucleus in black violet, and cytoplasm in red. Papanicolaou stain is used for cell examination and stains cytoplasm in orange, light green, or the like, depending on the degree of differentiation. Immunostaining uses an antigen-antibody reaction to stain specific tissues; specifically, the antibody bound to the target is visualized with the DAB dye, and the nuclei are counterstained with hematoxylin. In the description of the embodiments below, the input image is an image obtained by capturing a specimen stained by immunostaining; however, changes may be made as appropriate depending on the staining technique used.
The image processing apparatus 1 illustrated in
The input unit 10 receives the learning data in which an input image, input from outside the image processing apparatus 1, is associated with a correct value. The input unit 10 outputs an input image (training image) included in the learning data to the calculator 11 and outputs a correct value to the learning unit 14. The input unit 10 is configured by using, for example, an interface module capable of communicating bi-directionally with the outside.
The calculator 11 calculates the hue of each pixel of the input image that is input from the input unit 10 and outputs the calculated per-pixel hues, together with the input image, to the classifier 12. The calculator 11 may divide the input image into predetermined regions and calculate the hue of the input image in each divided region.
The classifier 12 executes classification on each pixel or predetermined region of the input image, input from the calculator 11, based on the hue of the input image in each pixel input from the calculator 11 and outputs the classification result and the input image that is input from the calculator 11, to the modulator 13.
The modulator 13 modulates the color tone of the pixels of the input image in each class of the classification input from the classifier 12 and outputs the modulation result to the learning unit 14. Specifically, based on a reference hue parameter in the storage unit 15 described below, the modulator 13 modulates the hue of the pixels in each class of the classification and outputs the input image with the modulated hue to the learning unit 14.
The learning unit 14 executes machine learning such as regression analysis or a neural network based on the input image with the modulated hue, input from the modulator 13, and on the correct value associated with the input image, and stores the learning result in a learning-result storage unit 151 of the storage unit 15. The targets for learning by the learning unit 14 are various, including, for example, estimating the amount of dye, executing tissue classification, and determining the grade of a disease state (lesion). The correct value is, in the case of the amount of dye, an image having quantitative values for each pixel corresponding to the number of dyes; in the case of tissue classification, the class number assigned to each pixel; and in the case of the grade of a disease state, a value indicating a single grade assigned to the whole image.
The storage unit 15 is configured by using a volatile memory, a non-volatile memory, a memory card, or the like. The storage unit 15 includes the learning-result storage unit 151, a reference-hue parameter storage unit 152, and a program storage unit 153. The learning-result storage unit 151 stores a learning result obtained by learning of the learning unit 14. The reference-hue parameter storage unit 152 stores the reference hue parameter that is referred to when the modulator 13 modulates the hue of a training image. The program storage unit 153 stores various programs executed by the image processing apparatus 1 and various types of data used during the execution of a program.
The image processing apparatus 1 having the above configuration is implemented by using, for example, a central processing unit (CPU), a graphics processing unit (GPU), a field programmable gate array (FPGA), or a digital signal processor (DSP), which reads various programs from the program storage unit 153 of the storage unit 15 and sends instructions or data to each unit included in the image processing apparatus 1 so as to perform each function.
Process of Image Processing Apparatus
Next, a process performed by the image processing apparatus 1 is described.
As illustrated in
Then, the calculator 11 calculates the hue of each pixel of the input image, input from the input unit 10 (Step S102). Specifically, the calculator 11 calculates the hue of each pixel of the input image and outputs the calculation result to the classifier 12.
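The per-pixel hue calculation at Step S102 can be sketched as follows. The embodiment does not fix a particular color space, so the standard RGB-to-HSV hue conversion used here, along with the function name and value ranges, is an assumption for illustration.

```python
import numpy as np

def pixel_hues(rgb):
    """Compute the hue (degrees, 0-360) of every pixel of an RGB image.

    `rgb` is an (H, W, 3) float array with values in [0, 1]. This is the
    standard RGB -> HSV hue formula; achromatic (gray) pixels get hue 0.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    mx = rgb.max(axis=-1)
    mn = rgb.min(axis=-1)
    delta = mx - mn
    hue = np.zeros_like(mx)
    # Avoid division by zero for achromatic pixels.
    nz = delta > 0
    rmax = nz & (mx == r)
    gmax = nz & (mx == g) & ~rmax
    bmax = nz & (mx == b) & ~rmax & ~gmax
    hue[rmax] = (60 * (g - b)[rmax] / delta[rmax]) % 360
    hue[gmax] = 60 * (b - r)[gmax] / delta[gmax] + 120
    hue[bmax] = 60 * (r - g)[bmax] / delta[bmax] + 240
    return hue
```

The resulting hue map has the same spatial shape as the input image, so later per-pixel classification and modulation can index it directly.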
Then, the classifier 12 executes classification on each pixel of the input image based on the hue of each pixel of the input image calculated by the calculator 11 (Step S103). Specifically, the classifier 12 classifies each pixel of the input image into a DAB pixel, an H pixel, or other pixels based on the hue calculated by the calculator 11 and outputs a result of the classification to the modulator 13.
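The classification at Step S103 can be sketched as a hue-range test per pixel. The specific hue ranges (brownish for DAB, blue-violet for H) and the saturation threshold for rejecting near-achromatic background pixels are illustrative assumptions, not values from the embodiment.

```python
import numpy as np

DAB, H, OTHER = 0, 1, 2

def classify_pixels(hue, saturation):
    """Assign each pixel to the DAB, H, or 'other' class from its hue.

    `hue` is in degrees (0-360), `saturation` in [0, 1]. Pixels with very
    low saturation are treated as background and left in the OTHER class.
    """
    labels = np.full(hue.shape, OTHER, dtype=np.int32)
    chromatic = saturation > 0.1                          # assumed threshold
    labels[chromatic & (hue >= 20) & (hue < 60)] = DAB    # brownish range
    labels[chromatic & (hue >= 200) & (hue < 280)] = H    # blue-violet range
    return labels
```

In practice, the ranges would be tuned to the dye spectra of the staining protocol in use.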
Subsequently, the modulator 13 modulates, based on the reference hue parameter, the hue of the pixel of the input image in each classification input from the classifier 12 (Step S104). Specifically, the modulator 13 executes hue modulation on a DAB pixel and an H pixel, which have undergone the classification and input from the classifier 12, based on the reference hue parameter stored in the reference-hue parameter storage unit 152 and does not execute hue modulation on other pixels. After Step S104, the image processing apparatus 1 proceeds to Step S105 described below.
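The modulation at Step S104 can be sketched as shifting each stained class so that its mean hue lands on the reference hue, while the other pixels are left untouched. The use of a plain arithmetic mean (rather than a circular mean) and the dictionary-based reference parameter are simplifying assumptions.

```python
import numpy as np

def modulate_hue(hue, labels, reference_hue, modulate_classes=(0, 1)):
    """Shift the mean hue of each stained class onto its reference hue.

    For every class listed in `modulate_classes`, the class's pixels are
    rotated by (reference - class mean), which preserves the shape of the
    hue distribution while matching the reference color shade.
    `reference_hue` maps class label -> reference hue in degrees.
    """
    out = hue.copy()
    for cls in modulate_classes:
        mask = labels == cls
        if not mask.any():
            continue
        shift = reference_hue[cls] - hue[mask].mean()
        out[mask] = (hue[mask] + shift) % 360
    return out
```

Pixels outside `modulate_classes` (the "other" pixels of Step S104) pass through unchanged.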
Here, the details of the hue modulation process executed by the modulator 13 are described.
As illustrated in
With reference back to
The learning unit 14 executes learning by using the pair of the training image with the modulated hue, input from the modulator 13, and the correct value input from the input unit 10 (Step S105) and outputs a learning parameter that is a learning result to the learning-result storage unit 151 (Step S106). After Step S106, the image processing apparatus 1 ends this process.
According to the first embodiment described above, the hue of the input image is modulated so as to match the color shade; therefore, even when there are color variations due to differences in staining, there is no need to execute learning of input images for each stain, and different learning images may be acquired in a simple process, which enables efficient learning.
Next, a second embodiment of the present disclosure is described. According to the second embodiment, after the hue of an input image is modulated, inference is executed by using a learning result. After the configuration of an image processing apparatus according to the second embodiment is described, a process performed by the image processing apparatus according to the second embodiment is described below. The same components as those of the image processing apparatus 1 according to the first embodiment described above are denoted by the same reference numeral, and the detailed description is omitted.
Configuration of Image Processing Apparatus
The inference unit 16 executes inference based on a learning result stored in the learning-result storage unit 151 and a training image input from the modulator 13 and outputs the inference result to the output unit 17.
The output unit 17 outputs the inference result input from the inference unit 16. The output unit 17 is configured by using, for example, a liquid crystal or organic electroluminescence (EL) display panel, or a speaker. It is obvious that the output unit 17 may be configured by using an output interface module that outputs an inference result to an external display device, etc.
Process of Image Processing Apparatus
Next, a process performed by the image processing apparatus 1A is described.
At Step S205, the inference unit 16 applies a learning parameter, which is a learning result stored in the learning-result storage unit 151, to the modulated training image input from the modulator 13 to execute inference. In this case, the inference unit 16 outputs the inference result (inference value) to the output unit 17.
Subsequently, the output unit 17 outputs the inference value input from the inference unit 16 (Step S206).
According to the second embodiment described above, as the hue of the input training image is modulated so as to match the color shade, it is possible to input the image having the same color shade as that used for learning, which enables high-accuracy inference.
Next, a third embodiment of the present disclosure is described. According to the third embodiment, learning is executed by selectively using hue rotation and fixing for each class. After a configuration of an image processing apparatus according to the third embodiment is described, a process performed by the image processing apparatus according to the third embodiment is described below. The same components as those of the image processing apparatus 1A according to the second embodiment described above are denoted by the same reference numeral, and the detailed description is omitted.
Configuration of Image Processing Apparatus
The selector 131 selects and determines the method for modulating a hue for each class input from the classifier 12 and outputs the determination result, the input image, and the classification result to the processing unit 132.
With regard to the input image that is input from the selector 131, the processing unit 132 modulates the hue of each class by using the modulation method selected by the selector 131 for each class and outputs the input image with the modulated hue to the inference unit 16.
Process of Image Processing Apparatus
Next, a process performed by the image processing apparatus 1B is described.
At Step S304, the selector 131 selects the method for modulating the hue of each class input from the classifier 12. Specifically, when two dyes, DAB and H, are used and each pixel is classified into a DAB class or an H class, the selector 131 selects, for the DAB class, a modulation method that rotates the hue so as to leave the original distribution, because the distribution of DAB values needs to be preserved; and the selector 131 selects, for the H class, a method that modulates the hue to a fixed value, because only shape identification is necessary for H.
Subsequently, with regard to the input image that is input from the selector 131, the processing unit 132 modulates the hue of each class by using the modulation method selected by the selector 131 for each class and outputs the input image with the modulated hue to the inference unit 16 (Step S305). After Step S305, the image processing apparatus 1B proceeds to Step S306.
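The selective modulation of Steps S304 and S305 can be sketched as a per-class dispatch between hue rotation (which preserves the original distribution) and replacement by a fixed value. The method-selection dictionary and the numeric parameters are illustrative assumptions.

```python
import numpy as np

ROTATE, FIX = "rotate", "fix"

def modulate_by_method(hue, labels, methods, params):
    """Apply the per-class modulation method chosen by the selector.

    `methods` maps class label -> ROTATE (shift the class's hues by a
    rotation angle, preserving the distribution's shape) or FIX (replace
    every hue in the class by a single fixed value). `params` gives the
    rotation angle or fixed hue value per class, in degrees.
    """
    out = hue.copy()
    for cls, method in methods.items():
        mask = labels == cls
        if method == ROTATE:
            out[mask] = (hue[mask] + params[cls]) % 360
        elif method == FIX:
            out[mask] = params[cls]
    return out
```

After FIX, all pixels of that class share one hue value, which matches the "linear distribution having the same value" described for the third embodiment.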
The details of a hue modulation process performed by the processing unit 132 are described.
As illustrated in
According to the third embodiment described above, as the hue distribution after the hue is set to the fixed value becomes a linear distribution having the same value within each class, it is possible to input an image having the same color shade as that used for learning, which enables high-accuracy inference.
Next, a fourth embodiment of the present disclosure is described. According to the fourth embodiment, hue modulation is executed so that different color shades of the images are changed to the same color shade for observation. The same components as those of the image processing apparatus 1A according to the second embodiment described above are denoted by the same reference numeral, and detailed description is omitted.
Configuration of Image Processing Apparatus
The display unit 18 displays information and images corresponding to various types of data output from the inference unit 16. The display unit 18 is configured by using a liquid crystal display, an organic EL display, or the like.
Process of Image Processing Apparatus
At Step S404, the modulator 13 executes hue modulation on input images such that the different color shades of the images are changed into the same color shade. After Step S404, the image processing apparatus 1C proceeds to Step S405.
As illustrated in
As illustrated in
According to the fourth embodiment described above, as the display unit 18 displays specimen images that originally have different color shades in the same color shade, it is possible to easily observe and compare only the differences in the states of cells and tissues.
Next, a fifth embodiment of the present disclosure is described. An image processing apparatus according to the fifth embodiment is different from the image processing apparatus according to the second embodiment described above in the configuration and the process performed. Specifically, according to the fifth embodiment, the standard hue is calculated. After a configuration of the image processing apparatus according to the fifth embodiment is described, the process performed by the image processing apparatus according to the fifth embodiment is described below.
Configuration of Image Processing Apparatus
The standard-hue calculator 19 calculates the color distributions of images prepared for calculating a standard value and thereby calculates a standard distribution.
Process of Image Processing Apparatus
Next, a process performed by the image processing apparatus 1D is described.
As illustrated in
Then, the standard-hue calculator 19 calculates the color distributions of the images input from the input unit 10 to calculate the standard distribution (Step S502) and outputs the calculated standard distribution to the reference-hue parameter storage unit 152 of the storage unit 15 (Step S503). After Step S503, the image processing apparatus 1D ends this process.
Here, the method for calculating the standard distribution by the standard-hue calculator 19 is described.
As illustrated in
According to the fifth embodiment described above, the standard-hue calculator 19 sets, in the standard distribution, the average of the hue in the distribution regarded as DAB as the DAB average hue and the average of the hue in the distribution regarded as H as the H average hue so as to calculate the standard hue (the standard hue parameter).
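The standard-hue calculation of the fifth embodiment can be sketched as pooling the hue and class maps of several specimen images and averaging the hue of each stained class. The circular mean (so that hues near the 0/360 wrap-around average correctly) is a detail the embodiment does not specify and is an assumption here, as are the function and label names.

```python
import numpy as np

def standard_hue(hues_list, labels_list):
    """Compute the standard hue parameter from several specimen images.

    `hues_list` holds per-image hue maps (degrees) and `labels_list` the
    matching class maps. The pixels of all images are pooled (the
    'combined image'); the circular mean hue of the pixels regarded as
    DAB and as H becomes the DAB and H standard hue, respectively.
    """
    hue = np.concatenate([h.ravel() for h in hues_list])
    lab = np.concatenate([l.ravel() for l in labels_list])
    std = {}
    for cls in (0, 1):                      # 0: DAB class, 1: H class
        theta = np.deg2rad(hue[lab == cls])
        # Circular mean: average the unit vectors, then take the angle.
        mean = np.arctan2(np.sin(theta).mean(), np.cos(theta).mean())
        std[cls] = np.rad2deg(mean) % 360
    return std
```

The returned dictionary plays the role of the reference hue parameter stored in the reference-hue parameter storage unit 152.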
Next, a sixth embodiment of the present disclosure is described. An image processing apparatus according to the sixth embodiment is the same as that in the fifth embodiment described above in the configuration and is different in the process performed by the image processing apparatus. Specifically, according to the sixth embodiment, the color is modulated as appropriate for the trained learning unit. A learning method implemented by the learning unit included in the image processing apparatus according to the sixth embodiment is described below. The same components as those in the fifth embodiment described above are denoted by the same reference numeral, and the detailed description is omitted.
Learning Process by Learning Unit
As the image used for learning is unknown, the learning unit 14 calculates the parameter for setting the appropriate color shade as described below. As illustrated in
According to the sixth embodiment described above, it is possible to also input the image having the appropriately modulated color shade to the existing learning unit (learning device).
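The parameter calculation of the sixth embodiment can be sketched as a search over hue rotations: each candidate rotation is applied to an evaluation image, the existing trained learner is run, and the rotation whose output best matches the correct value is kept as the color-shade parameter. Treating the learner as a black-box function `infer` and using mean absolute error as the metric are assumptions for illustration.

```python
import numpy as np

def find_color_parameter(image_hue, correct, infer, angles=range(0, 360, 10)):
    """Find the hue rotation best suited to an existing trained learner.

    `image_hue` is the hue map of an evaluation image associated with a
    correct value, `infer` runs the trained learner on a hue map, and
    `angles` is the candidate rotation grid in degrees. Returns the
    rotation with the smallest mean absolute error and that error.
    """
    best_angle, best_err = None, np.inf
    for angle in angles:
        rotated = (image_hue + angle) % 360   # candidate color shade
        err = np.abs(infer(rotated) - correct).mean()
        if err < best_err:
            best_angle, best_err = angle, err
    return best_angle, best_err
```

The selected rotation can then be applied to any new input image before it is fed to the existing learning unit.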
The components described in the first embodiment to the sixth embodiment described above may be combined as appropriate to form various embodiments. For example, some components may be deleted from all the components described in the first embodiment to the sixth embodiment described above. Furthermore, the components described in different embodiments may be combined as appropriate.
In the first embodiment to the sixth embodiment, the “unit” described above may be replaced with a “means”, a “circuitry”, or the like. For example, the input unit may be replaced with an input means or an input circuitry.
A program to be executed by the image processing apparatuses according to the first embodiment to the sixth embodiment is provided by being recorded, as file data in an installable or executable format, in a computer-readable recording medium such as a CD-ROM, a flexible disk (FD), a CD-R, a digital versatile disk (DVD), a USB medium, or a flash memory.
A configuration may be such that a program to be executed by the image processing apparatuses according to the first embodiment to the sixth embodiment is provided by being stored on a computer connected via a network, such as the Internet, and downloaded via the network. A program to be executed by the image processing apparatuses according to the first embodiment to the sixth embodiment may be provided or distributed via a network such as the Internet.
Although an input image is received from various devices via, for example, a transmission cable in the first embodiment to the sixth embodiment, the connection does not need to be wired and may be wireless. In this case, a signal may be transmitted from each device in accordance with a predetermined wireless communication standard (e.g., Wi-Fi (registered trademark) or Bluetooth (registered trademark)). It is obvious that wireless communication may be executed in accordance with a different wireless communication standard.
In the flowcharts in this specification, expressions such as "first", "then", and "subsequently" are used to indicate the order of processes at steps; however, the order of processes necessary to implement the present disclosure is not uniquely defined by these expressions. That is, the order of processes in the flowcharts in this specification may be changed as long as there is no contradiction.
According to the present disclosure, there is an advantage such that it is possible to acquire different learning images in a simple process.
Additional advantages and modifications will readily occur to those skilled in the art. Therefore, the disclosure in its broader aspects is not limited to the specific details and representative embodiments shown and described herein. Accordingly, various modifications may be made without departing from the spirit or scope of the general inventive concept as defined by the appended claims and their equivalents.
This application is a continuation of International Application No. PCT/JP2018/022625, filed on Jun. 13, 2018, the entire contents of which are incorporated herein by reference.
Parent application: PCT/JP2018/022625, filed Jun. 2018; child application: U.S. application No. 17/117,338.