1. Field of the Invention
The present invention relates to an image processing method and an image processing system.
2. Description of the Related Art
A virtual slide system, which images a sample on a slide using a digital microscope to acquire a virtual slide image (hereafter called a “slide image”) and displays this image on a monitor for observation, is receiving attention (see Japanese Patent Application Laid-open No. 2011-118107).
A pathological image system technique, for managing and displaying a gross image (digital image of a lesion area) and a slide image (microscopic digital image) separately and linking these images, is also known (see Japanese Patent Application Laid-open No. 2000-276545).
Based on the pathological image system technique disclosed in Japanese Patent Application Laid-open No. 2000-276545, the gross image and the slide image can be managed and displayed as linked with each other, but neither the area of the gross image that corresponds to the slide image nor the correspondence between the lesion in the gross image and the lesion in the slide image can be recognized.
With the foregoing in view, it is an object of the present invention to provide a technique that allows visually and intuitively recognizing the correspondence of information acquired from a gross organ and information acquired from a plurality of samples collected from the gross organ.
The present invention in its first aspect provides an image processing method, comprising: acquiring data on a plurality of sample images, acquired by imaging a plurality of samples collected from different positions of a gross organ that includes a lesion, by a computer; extracting information on the lesion from each of the plurality of sample images; and generating data on a pathological information image by combining information on the lesion extracted from each of the plurality of sample images, on an image expressing the gross organ, by the computer.
The present invention in its second aspect provides an image processing system, comprising: an acquiring unit configured to acquire data on a plurality of sample images, acquired by imaging a plurality of samples collected from different positions of a gross organ that includes a lesion; an information extracting unit configured to extract information on a lesion from each of the plurality of sample images; and a data generating unit configured to generate data on a pathological information image by combining information on the lesion extracted from each of the plurality of sample images, on an image expressing the gross organ.
The present invention in its third aspect provides a non-transitory computer-readable storage medium that records a program for a computer to execute each step of the image processing method according to the present invention.
According to the present invention, it is possible to generate an image (a pathological information image) that allows visually and intuitively recognizing the correspondence of information acquired from a gross organ and information acquired from a plurality of samples collected from the gross organ.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
The present invention relates to a technique to generate an image which is effective for pathological diagnosis from a plurality of sample images captured by a digital microscope or the like. In concrete terms, information on a lesion is extracted from a plurality of sample images collected from different positions of a gross organ (all or part of an internal organ), and data on the pathological information image is generated by combining the extracted information on an image expressing the gross organ. By displaying this pathological information image, the correspondence of information acquired from a plurality of samples collected from the gross organ and information acquired from the gross organ can be visually and intuitively recognized. If the information on the lesion extracted from each sample image is combined at the corresponding position on the image expressing the gross organ, so that the position from which each sample was collected can be recognized on the gross organ, then the position, range, progress or the like of the lesion in the gross organ can be recognized accurately and intuitively.
Now the preferred embodiments of the present invention will be described with reference to the drawings.
The image processing method of the present invention can be used in the pathological diagnostic processing steps. These pathological diagnostic processing steps will be described with reference to the drawings. Now, how these pathological diagnostic processing steps are carried out will be described using the pathological diagnosis of a stomach as an example.
In step S201, a stomach (internal organ) is excised and extirpated. The excised range is determined by comprehensively judging the location and stage of the lesion, the age and medical history of the patient, or the like. Typical excision ranges of a stomach are shown in the drawings.
In step S202, the sample is treated and fixed. The stomach excised and extirpated in step S201 is saved in a diluted formalin solution to fix the organ. Fixing prevents the tissue from degenerating, stabilizes its shape and structure, strengthens its stainability and maintains its antigenicity.
In step S203, extirpation is performed. The lesion area is extirpated based on the judgment of the pathologist. Not only a lesion area that can be visually recognized, but also an area where lesions tend to occur, is extirpated. The gross organ and the sample blocks are imaged before and after extirpation, so as to confirm the correspondence of “the gross organ as macro-information” and “the slides as micro-information”.
Slides are created in step S204. The slides are created from the sample blocks via such steps as drying, paraffin embedding, slicing, staining, sealing, labeling and segment checking. Hematoxylin-eosin (HE) stained slides are created for biopsy.
The gross organ shown in the drawings will be described next. The pathologist selects the thin sliced surface 601 when extirpation is performed. This thin sliced surface 601 is an XZ cross-section, because the invasion depth of the lesion in the thickness direction (Z direction) of the stomach wall is determined in the pathological diagnosis. The cross-sections shown in the drawings are XZ cross-sections of this kind.
The “micro-pathological information extracted from the discretely sampled slides” and the “macro-pathological information extracted from the gross organ” shown in the drawings are integrated in the pathological diagnosis.
The image processing method of this example will be described with reference to the drawings.
(0) General Flow
In step S801, the pathological information is extracted from the gross image. Details will be described in (1) below.
In step S802, the pathological information is extracted from the slide images. Details will be described in (2) below.
In step S803, the gross image and the slide images are aligned. Details will be described in (3) below.
In step S804, the pathological information image data is generated and displayed. Details will be described in (4) below.
(1) Step S801: Pathological Information Extraction from Gross Image
The gross pathological information in this example is information that includes the area of the lesion 408 extracted from the gross image, the mesh-divided extirpation area, the macro-lesion area 901 and the positional relationship thereof.
First the flow of the pathological information extraction from the gross image will be described with reference to the flow chart.
In step S1001, the gross image is acquired. In this processing, the computer reads data on the gross image from the storage device.
In step S1002, the lesion area is extracted. For example, the user observes the gross organ (actual organ) or the gross image, and specifies the area of the lesion 408. Then using such an operation device as a mouse, the user specifies the lesion area for the gross image or the CG of the gross organ displayed on the monitor screen of the computer. As mentioned above, the computer may automatically extract and set the lesion area based on the image analysis. The lesion area extracted in this step is called an “extracted lesion area”.
In step S1003, the extirpation area is specified. There are two methods to specify extirpation areas: a user specification (manual specification), and a computer specification (automatic specification). If the user specifies the extirpation area, the user specifies the extirpation area in the gross image or the CG of the gross organ displayed on the monitor screen of the computer, using such an operation device as a mouse. The computer specification (automatic specification) will now be described.
In step S1004, the lesion area and the divided cells are associated with each other. First the mesh-division range 902 in each extirpation area is divided into 5 cells. Then it is determined whether each divided cell overlaps with the area of the lesion 408, and a divided cell that includes the lesion 408 is regarded as a “lesion area”. For example, the lesion area and the divided cells can be associated with each other by attaching a “1” flag to a lesion-area divided cell, and attaching a “0” flag to a non-lesion-area divided cell. The area constituted by the set of divided cells to which the “1” flag is attached is the above mentioned macro-lesion area 901. By this step, the area of the lesion 408 extracted from the gross image, the mesh-divided extirpation area, the macro-lesion area 901 and the positional relationship thereof are stored as the gross pathological information.
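As a minimal sketch of this flagging scheme, assuming the area of the lesion 408 is held as a binary mask image and the mesh-division range 902 is given as pixel bounds (the function names and the division along the X direction are illustrative assumptions, not the patent's prescribed implementation):

```python
import numpy as np

def divide_into_cells(x0, y0, x1, y1, n_cells=5):
    """Split a mesh-division range into n_cells equal divided cells along X."""
    edges = np.linspace(x0, x1, n_cells + 1).astype(int)
    return [(edges[i], y0, edges[i + 1], y1) for i in range(n_cells)]

def flag_lesion_cells(lesion_mask, cell_boxes):
    """Attach a "1" flag to each divided cell that overlaps the lesion area.

    lesion_mask -- binary 2-D array, 1 where the extracted lesion 408 lies
    cell_boxes  -- (x0, y0, x1, y1) pixel bounds of each divided cell
    """
    flags = []
    for x0, y0, x1, y1 in cell_boxes:
        overlaps = lesion_mask[y0:y1, x0:x1].any()
        flags.append(1 if overlaps else 0)   # 1 = lesion-area divided cell
    return flags
```

The set of divided cells flagged with 1 then constitutes the macro-lesion area 901.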
In step S1005, the extirpation dimension is acquired. The extirpation dimension is determined with regard to the thin sliced sample, which is later mounted on a slide.
In step S1006, an extirpation area other than the extracted lesion area extracted in step S1002 is specified. Lesions of a stomach often occur in the lesser curvature, hence in this example the lesser curvature area is specified as an extirpation area, besides the extracted lesion area. To specify the extirpation area in this step, either the manual specification by the user or the automatic specification by the computer can be used. In the case of the manual specification, a desired area on the gross image can be specified using such an operation device as a mouse, for example. In the case of the automatic specification, an area where a lesion easily occurs (e.g. the lesser curvature) can be detected based on image analysis.
In step S1007, the extirpation area is mapped. The extirpation area is mapped so as to include the target area constituted by the lesion area extracted in step S1002 and the area specified in step S1006. The mapping can be implemented using a simple algorithm, such as determining a rectangular area in which the target area is inscribed, and arranging the extirpation areas such that this rectangular area is included (see the sketch below). The user may adjust the position of the extirpation areas after the automatic mapping is performed by the computer.
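One possible form of such a simple algorithm, assuming the target area is a binary mask and the extirpation areas are strips of a fixed width arranged along the X direction (the strip layout and the names are assumptions for illustration):

```python
import numpy as np

def bounding_rectangle(target_mask):
    """Rectangle (x0, y0, x1, y1) in which the target area is inscribed."""
    ys, xs = np.nonzero(target_mask)
    return int(xs.min()), int(ys.min()), int(xs.max()) + 1, int(ys.max()) + 1

def map_extirpation_areas(target_mask, strip_width):
    """Arrange strip-shaped extirpation areas so the bounding rectangle is covered."""
    x0, y0, x1, y1 = bounding_rectangle(target_mask)
    n = int(np.ceil((x1 - x0) / strip_width))
    return [(x0 + i * strip_width, y0,
             min(x0 + (i + 1) * strip_width, x1), y1) for i in range(n)]
```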
In step S1008, a number is assigned to the extirpation area. A number is assigned to each extirpation area so that the positional relationship between the slides created later and the gross image can be recognized. In this example, one slide is created for each extirpation area, hence serial numbers are assigned to the 21 extirpation areas respectively.
(2) Step S802: Pathological Information Extraction from Slide Image
The slide pathological information in this example is information that includes the area of the lesion 1103, the mesh-divided slide image, the micro-lesion area 1205, and the invasion depth of the thin sliced sample 1102. The area of the lesion 1103 can be expressed as a mask image, for example. The “mesh-divided slide image” refers to an image inside the mesh-division range 1204 shown in the drawings.
First the flow of pathological information extraction from the slide image will be described with reference to the flow chart.
In step S1301, slide images are acquired. This step is, for example, a processing operation where the computer reads data of a plurality of slide images from the storage device.
In step S1302, a lesion area and an invasion depth are extracted. There are two methods to extract the lesion area and the invasion depth: extraction by the user (manual extraction), and extraction with computer assistance (semi-automatic extraction). If the user extracts the lesion 1103 from the slide image, the user specifies the recognized range of the lesion 1103 in the slide image displayed on the monitor screen of the computer, using such an operation device as a mouse. The extraction with computer assistance (semi-automatic extraction) will be described below (steps S1307 to S1312).
In step S1303, sample reference points are extracted. The sample reference points are points such as the center, left end or right end of the thin sliced sample 1102 in the X direction.
In step S1304, it is determined whether the processing operations from steps S1301 to S1303 have been executed for all slide images. If the processing operations are completed for all slide images, processing advances to step S1305.
In step S1305, the slide images are aligned. Each slide image is aligned in the X direction using the sample reference points extracted in step S1303 (see the sketch below).
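A minimal sketch of steps S1303 and S1305, assuming each thin sliced sample 1102 is available as a binary mask and that a circular shift is an acceptable simplification for the X-direction alignment (the function names are illustrative):

```python
import numpy as np

def sample_reference_point(sample_mask, kind="center"):
    """X coordinate of a reference point of the thin sliced sample (step S1303)."""
    xs = np.nonzero(sample_mask.any(axis=0))[0]   # columns containing sample pixels
    if kind == "left":
        return int(xs[0])
    if kind == "right":
        return int(xs[-1])
    return int((xs[0] + xs[-1]) // 2)             # "center"

def align_slide_images_in_x(images, sample_masks, kind="center"):
    """Shift each slide image in X so its reference point matches the first (S1305)."""
    refs = [sample_reference_point(m, kind) for m in sample_masks]
    # Circular shift is used for brevity; adequate when the image margins are background.
    return [np.roll(img, refs[0] - r, axis=1) for img, r in zip(images, refs)]
```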
In step S1306, the lesion area and the divided cells are associated with each other. First the mesh-division range 1204 in each slide image is divided into 5 cells. Then it is determined whether each divided cell overlaps with the lesion 1103, and a divided cell that includes the lesion 1103 is regarded as a micro-lesion area. For example, the lesion area and the divided cells can be associated with each other by attaching a “1” flag to a divided cell which is a micro-lesion area, and attaching a “0” flag to a divided cell which is not. By this step, the area of the lesion extracted from the slide images, the mesh-divided slide images, the micro-lesion area 1205 and the invasion depth of the thin sliced samples are stored as the slide pathological information.
In step S1307, the sample area is extracted. The area of the thin sliced sample 1102 is extracted from the slide image. The area can be extracted using a simple algorithm, such as binarizing the image after adjusting the histogram (see the sketch below).
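One way such a simple algorithm might look, sketched here with OpenCV and Otsu's threshold as an assumed concrete choice of binarization (the source does not specify the library or the thresholding method):

```python
import cv2
import numpy as np

def extract_sample_area(slide_bgr):
    """Extract the thin sliced sample area by binarizing after histogram adjustment."""
    gray = cv2.cvtColor(slide_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.equalizeHist(gray)                       # histogram adjustment
    # The stained sample is darker than the bright glass background, so Otsu's
    # threshold separates the two; inverting makes the sample area 1.
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    return (mask > 0).astype(np.uint8)
```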
In step S1308, reference tissues (mucosa-fixing layer, sub-mucosa, lamina propria, sub-serosa and serosa) are specified. These reference tissues are used to determine the invasion depth in step S1312; the user can specify them in the slide image displayed on the monitor screen using such an operation device as a mouse.
In step S1309, a hematoxylin area (nucleus) is extracted. In biopsy, the thin sliced sample 1102 is stained with hematoxylin-eosin (HE). Hematoxylin is a bluish-purple dye used to stain the nucleus of a cell or the like, and eosin is a pink dye used to stain cytoplasm or the like. In this step, the hematoxylin area (nucleus) that is stained bluish-purple is extracted using the color information of the slide image data (see the sketch below).
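A hedged sketch of this color-based extraction, using color deconvolution into hematoxylin/eosin channels as one concrete realization (the library choice and the threshold value are assumptions, to be tuned to the actual staining conditions):

```python
import numpy as np
from skimage.color import rgb2hed

def extract_hematoxylin_area(slide_rgb, thresh=0.05):
    """Extract the bluish-purple hematoxylin (nucleus) area from an HE-stained slide.

    slide_rgb -- RGB image (convert from BGR first if loaded with OpenCV)
    thresh    -- assumed stain-density threshold; tune to staining conditions
    """
    hed = rgb2hed(slide_rgb)          # deconvolve into H, E, DAB channels
    hematoxylin = hed[:, :, 0]        # channel 0 = hematoxylin density
    return (hematoxylin > thresh).astype(np.uint8)
```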
In step S1310, feature values are extracted by structure recognition. For the structure recognition, an algorithm that applies graph theory can be used. Based on the information on the nuclei extracted in step S1309, a Voronoi diagram, a Delaunay diagram, a minimum spanning tree or the like is drawn. For example, in the case of the Voronoi diagram, an average, a standard deviation and a minimum-maximum ratio are determined for the area, the perimeter and the length of one side of the polygon (closed area) respectively, and the determined values are regarded as the feature values (9 values). In the case of the Delaunay diagram, an average, a standard deviation and a minimum-maximum ratio are determined for the area and the perimeter of the triangle (closed area) respectively, and the determined values are regarded as the feature values (6 values). In the case of the minimum spanning tree, the minimum spanning tree is determined by weighting according to the length of the side, and the average, standard deviation and minimum-maximum ratio of the sides of the minimum spanning tree are determined and regarded as the feature values (3 values).
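The following sketch illustrates the Delaunay (6 values) and minimum spanning tree (3 values) feature extraction described above, given the nucleus positions from step S1309; the Voronoi features (9 values) can be computed analogously from scipy.spatial.Voronoi. This is an illustrative implementation under assumed data layouts, not the patent's prescribed one:

```python
import numpy as np
from scipy.spatial import Delaunay, distance_matrix
from scipy.sparse.csgraph import minimum_spanning_tree

def stats(values):
    """Average, standard deviation and minimum-maximum ratio of a set of values."""
    v = np.asarray(values, dtype=float)
    return v.mean(), v.std(), v.min() / v.max()

def delaunay_features(nuclei_xy):
    """6 feature values: stats of triangle area and perimeter in the Delaunay diagram."""
    tri = Delaunay(nuclei_xy)          # nuclei_xy: (N, 2) nucleus positions
    areas, perimeters = [], []
    for simplex in tri.simplices:
        a, b, c = nuclei_xy[simplex]
        perimeters.append(np.linalg.norm(b - a) + np.linalg.norm(c - b)
                          + np.linalg.norm(a - c))
        # Triangle area from the 2-D determinant.
        areas.append(abs((b[0] - a[0]) * (c[1] - a[1])
                         - (b[1] - a[1]) * (c[0] - a[0])) / 2.0)
    return (*stats(areas), *stats(perimeters))

def mst_features(nuclei_xy):
    """3 feature values: stats of the edge lengths of the minimum spanning tree."""
    mst = minimum_spanning_tree(distance_matrix(nuclei_xy, nuclei_xy))
    return stats(mst.data)             # lengths of the N-1 edges kept by the MST
```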
In step S1311, the lesion area is extracted. The lesion area is extracted based on the plurality of feature values extracted in step S1310. The structure of a benign tissue and the structure of a malignant tissue differ in a way that can be visually recognized, and whether a tissue is benign or malignant, and the degree of malignancy, can be determined using a plurality of feature values. In other words, the lesion area can be extracted using a plurality of feature values acquired from the slide images. If in step S1310 the feature values are acquired not only from a Voronoi diagram but also from a Delaunay diagram or a minimum spanning tree, or from slide images filtered by a Gabor filter or the like, comprehensive criteria for the lesion area can be created by combining these feature values. The criteria of the feature values that reflect the characteristics of the tissue may be created for each reference tissue (mucosa-fixing layer, sub-mucosa, lamina propria, sub-serosa and serosa).
In step S1312, the invasion depth is determined. The invasion depth (degree of infiltration) is determined by the deepest layer of the reference tissues (mucosa-fixing layer, sub-mucosa, lamina propria, sub-serosa and serosa) specified in step S1308 into which the lesion area determined in step S1311 has infiltrated.
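As a minimal sketch, assuming the reference-tissue layers from step S1308 and the lesion area from step S1311 are available as aligned binary masks (the function and variable names are illustrative):

```python
import numpy as np

# Reference-tissue layers specified in step S1308, ordered shallow to deep.
LAYERS = ["mucosa-fixing layer", "sub-mucosa", "lamina propria",
          "sub-serosa", "serosa"]

def invasion_depth(lesion_mask, layer_masks):
    """Deepest reference-tissue layer into which the lesion area infiltrated.

    lesion_mask -- binary mask of the lesion area from step S1311
    layer_masks -- dict mapping each layer name to its binary mask from step S1308
    """
    deepest = None
    for name in LAYERS:                              # walk from shallow to deep
        if np.logical_and(lesion_mask, layer_masks[name]).any():
            deepest = name                           # lesion reaches this layer
    return deepest
```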
(3) Step S803: Alignment of Gross Image and Slide Images
The macro-lesion area and the micro-lesion area will be described with reference to the drawings.
In step S1601, the slide image data is mapped on the gross image data. The information required for correspondence and alignment between the gross image and the slide images is acquired from the gross pathological information and the slide pathological information.
In step S1602, the lesion area and the invasion depth are mapped on the gross image data. Information on the lesion area in the gross image, and information on the lesion area and invasion depth of the slide images, are acquired from the gross pathological information and the slide pathological information respectively. The lesion area includes the macro-lesion area 901 and the micro-lesion area 1205, and the respective lesion areas may not match in some cases.
(4) Step S804: Generation and Display of Pathological Information Image Data
Besides the display methods described here, the display of the lesion area may be switchable between the macro-lesion area 901 and the micro-lesion area 1205, or the micro-lesion area 1205 may be displayed with priority. To express the invasion depth, gradation may be used in addition to pseudo-colors and contour lines.
The possible data display formats are, for example: 2D digital data, 3D digital data and CG data. The 2D digital data is a display format where the lesion area 1701 and the invasion depth display area 1702 are combined with two-dimensional gross image data. The 3D digital data is a display format where the lesion area 1701 and the invasion depth display area 1702 are combined with three-dimensional gross image data. The CG data is a display format where the gross image is created by CG, and the lesion area 1701 and the invasion depth display area 1702 are combined with CG data. Either two-dimensional gross CG or three-dimensional gross CG may be used.
In step S1801, the pathological information and the alignment information are acquired. The pathological information refers to the gross pathological information and the slide pathological information. The gross pathological information includes information on the area of the lesion 408 extracted from the gross image, the mesh-divided extirpation area, the macro-lesion area 901 and the positional relationships thereof. The slide pathological information includes information on the area of the lesion 1103, the mesh-divided slide images, the micro-lesion area 1205, and the invasion depth of the thin sliced samples 1102. The alignment information is information for associating the positional relationships of the macro-lesion areas with those of the micro-lesion areas.
In step S1802, a lesion area display method is selected. The lesion area display method is, for example, displaying the boundary of the lesion area as a rectangle or an ellipse. In step S1803, an invasion depth display method is selected. The invasion depth display method is, for example, a continuous or a discrete display, and a color or a contour line display. For example, a display method setting GUI is displayed on the monitor screen, and the user selects a desired display method using such an operation device as a mouse.
In step S1804, the pathological information image data is generated. The pathological information image data to be displayed is generated using the gross pathological information, the slide pathological information, the alignment information, the lesion area display method and the invasion depth display method (see the sketch below).
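A hedged sketch of this generation step for the 2D digital data format, combining a pseudo-color invasion depth display and a rectangular lesion area display onto the gross image (the colors, the alpha blending and the names are illustrative assumptions, not the patent's prescribed implementation):

```python
import cv2
import numpy as np

# Hypothetical pseudo-colors (BGR) per reference-tissue layer, shallow to deep.
DEPTH_COLORS = {
    "mucosa-fixing layer": (0, 255, 0),
    "sub-mucosa":          (0, 255, 255),
    "lamina propria":      (0, 165, 255),
    "sub-serosa":          (0, 69, 255),
    "serosa":              (0, 0, 255),
}

def generate_pathological_information_image(gross_bgr, lesion_mask, depth_masks,
                                            alpha=0.4):
    """Combine the lesion area and the invasion depth display onto the gross image.

    gross_bgr   -- 2D gross image (BGR)
    lesion_mask -- binary mask of the lesion area, aligned to the gross image (S803)
    depth_masks -- dict mapping a layer name to the binary mask of the regions
                   infiltrated down to that layer, aligned to the gross image
    """
    overlay = gross_bgr.copy()
    for layer, mask in depth_masks.items():      # pseudo-color the invasion depth
        overlay[mask > 0] = DEPTH_COLORS[layer]
    out = cv2.addWeighted(overlay, alpha, gross_bgr, 1.0 - alpha, 0.0)
    # Display the lesion area boundary as a rectangle (one selectable method, S1802).
    ys, xs = np.nonzero(lesion_mask)
    cv2.rectangle(out, (int(xs.min()), int(ys.min())),
                  (int(xs.max()), int(ys.max())), (255, 0, 255), 2)
    return out
```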
In step S1805, the pathological information image data generated in S1804 is displayed.
According to the image processing method of this example, an image processing method that allows intuitively recognizing the correspondence of the pathological information and the clinical information can be provided. In the pathological diagnosis, the lesion area and the invasion depth in the gross organ in its entirety are recognized by integrating the micro-pathological information extracted from the discretely sampled slides and the macro-pathological information extracted from the gross organ. The lesion area and its invasion depth are important information not only for the pathologist and the clinician, but also for the patient, in order to judge the stage of the illness and determine the treatment plan. By visualizing the correspondence between the micro-pathological information and the macro-pathological information in such a way that intuitive recognition is possible, the user (pathologist) can convey the pathological information, including the lesion area and the invasion depth, to the clinician and the patient more accurately and quickly. Thereby inconsistencies in information transfer can be decreased, and information can be transferred more efficiently.
An example of an image processing system to execute the above mentioned image processing method will be described with reference to the drawings.
The imaging apparatus 1901 is a virtual slide scanner which has a function to image an object at high magnification and output a high resolution digital image. To acquire the two-dimensional image, a solid state image sensing device, such as a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) sensor, is used. Instead of the virtual slide scanner, the imaging apparatus 1901 may be constituted by a digital microscope apparatus which has a digital camera attached to the eyepiece of a standard optical microscope.
The image processor 1902 has a function to generate data to be displayed on the display device 1903 from a plurality of original image data acquired from the imaging apparatus 1901, according to requests from the user. The image processor 1902 is constituted by a general purpose computer or workstation which includes such hardware resources as a central processing unit (CPU), memory (RAM), a storage device and an operation device. The storage device is a large capacity information storage device, such as a hard disk drive, which stores the programs, data, operating system (OS) and the like for implementing the above mentioned image processing method. These functions are implemented by the CPU loading the required programs and data from the storage device into the memory and executing the programs. The operation device is constituted by a keyboard, a mouse or the like, and is used by the user to input various instructions.
The display device 1903 is a monitor that displays the gross image, the slide images, the gross pathological information, the slide pathological information and the pathological information images.
The data server 1904 is a mass storage device storing such data as gross images, slide images, gross pathological information, slide pathological information and pathological information images.
The functional configuration of the image processor 1902 will now be described.
The slide image data acquiring unit 2201 acquires slide image data from the storage device. If the slide image data is stored in the data server 1904, the slide image data is acquired from the data server 1904.
The slide image pathological information extracting unit 2202 extracts the slide pathological information from the slide image data, and stores the slide image data and the slide pathological information in the memory (see the description of step S802 above).
The gross image data acquiring unit 2203 acquires the gross image data from the data server 1904.
The gross image pathological information extracting unit 2204 extracts the gross pathological information from the gross image data, and stores the gross image data and the gross pathological information in the memory (see the description of step S801 above).
The user input information acquiring unit 2205 acquires the content of the various instructions inputted by the user using such an operation device as a mouse. For example, the lesion area extraction in the gross image (S1002), the extirpation area specification (S1003, S1006), the extirpation dimension specification (S1005), the lesion specification in the slide image (S1302), the reference tissue specification in the slide image (S1308) and the like are inputted.
The alignment unit 2206 reads the gross image data and the slide image data from the memory, and aligns the gross image and the slide images (see the description of step S803 above).
The display image data generating unit 2207 generates the pathological information image data according to the lesion area display method (S1802) and the invasion depth display method (S1803), which were inputted via the user input information acquiring unit 2205 (see the description of step S804 above).
The display image data transfer unit 2208 transfers the image data generated by the display image data generating unit 2207 to the graphics board. High-speed image data transfer between the memory and the graphics board is executed by the direct memory access (DMA) function. The image data transferred to the graphics board is displayed on the display device 1903.
According to the image processing system of this example, an image processing system that allows intuitively recognizing the correspondence of the pathological information and the clinical information can be provided. In the pathological diagnosis, the lesion area and the invasion depth in the gross organ in its entirety are recognized by integrating the micro-pathological information extracted from the discretely sampled slides and the macro-pathological information extracted from the gross organ. The lesion area and its invasion depth are important information not only for the pathologist and the clinician, but also for the patient, in order to judge the stage of the illness and determine the treatment plan. By visualizing the correspondence between the micro-pathological information and the macro-pathological information in such a way that intuitive recognition is possible, the user (pathologist) can convey the pathological information, including the lesion area and the invasion depth, to the clinician and the patient more accurately and quickly. Thereby inconsistencies in information transfer can be decreased, and information can be transferred more efficiently.
This example is one preferred embodiment of the present invention, and is not intended to limit the scope of the invention. The present invention can take various configurations within the scope of the technical spirit disclosed in the description and the Claims. For example, in this example, the lesion area (the spread of the lesion in the plane direction (XY directions)) and the invasion depth (the infiltration of the lesion in the depth direction (Z direction)) were presented as the pathological information, but any information may be presented as the pathological information as long as it is information on the lesion extracted from the slides and the gross organ. Only the lesion area or only the invasion depth may be presented as the pathological information. In this example, the pathological diagnosis of a stomach cancer was used as an example, but pathological information can be acquired and pathological information image data can be generated by the same processing even if the organ is other than the stomach, or the disease is other than a cancer.
Embodiments of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions recorded on a storage medium (e.g., non-transitory computer-readable storage medium) to perform the functions of one or more of the above-described embodiment(s) of the present invention, and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more of a central processing unit (CPU), micro processing unit (MPU), or other circuitry, and may include a network of separate computers or separate computer processors. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2013-224366, filed on Oct. 29, 2013, which is hereby incorporated by reference herein in its entirety.