The present disclosure relates to an information processing program, an information processing device, an information processing method, and a microscope system.
There is known a microscope system that images an observation target placed on a glass slide with a microscope and generates a whole slide imaging (WSI) image, which is a digitized pathological image. This microscope system can extract information by performing various types of image analysis. Furthermore, in order to perform these various types of image analysis on this microscope system, the user needs to adjust parameters.
Patent Document 1: Japanese Patent Application Laid-Open No. 2013-007849
However, if parameter adjustment is performed using all tissue regions on the WSI image, a long processing time is required. Accordingly, parameters are adjusted at a higher speed by using a small region of interest (ROI) on the WSI image. This parameter adjustment, however, may increase the workload of the observer.
Therefore, the present disclosure provides an information processing program, an information processing device, an information processing method, and a microscope system capable of more efficiently performing parameter adjustment.
In order to solve the above problem, according to the present disclosure, there is provided an information processing program for causing a computer to execute:
The determination step may determine not to allow the analysis processing to be performed with the predetermined parameter in a case where the first characteristic information and the second characteristic information are not within a predetermined distance on a statistical distribution.
A display control step of causing a display unit to display a display form in which setting of a new parameter for the analysis processing step is recommended in a case where the determination step determines not to allow the analysis processing to be performed with the predetermined parameter may be further included.
The first characteristic information may be generated on the basis of an image in a predetermined region of the first image, and
at least one of an image in the predetermined region or an image indicating the predetermined region may be displayed as the display form.
Text recommending the setting of a parameter may be displayed as the display form.
The display control step may cause the display unit to display an image indicating position information of the first characteristic information in the statistical distribution and the display form side by side.
The first characteristic information may be a plurality of pieces of characteristic information selected from a plurality of pieces of characteristic information based on a plurality of processing regions in the first image, and
the display control step may cause the display unit to display images in a processing region corresponding to the selected plurality of pieces of characteristic information side by side.
The first characteristic information may be a plurality of pieces of characteristic information selected from each of clustered regions by clustering a plurality of pieces of characteristic information based on each of a plurality of processing regions in the first image into a plurality of regions, and
the display control step may display the processing regions of each of the clustered regions on the first image in association with each other.
The first characteristic information may be a plurality of pieces of characteristic information selected from a plurality of pieces of characteristic information based on a plurality of processing regions in the first image, and
the determination step may change a determination reference in such a manner that the first characteristic information is not selected in a case where it is determined not to allow the analysis processing to be performed with the predetermined parameter and a parameter based on a processing region corresponding to the first characteristic information is not used for the analysis processing.
The first characteristic information may be a plurality of pieces of characteristic information selected from a plurality of pieces of characteristic information based on a plurality of processing regions in the first image, and
the determination step may change an imaging condition of an imaging device that has captured the first image in such a manner that the first characteristic information is not selected, in a case where it is determined not to allow the analysis processing to be performed with the predetermined parameter and a parameter based on a processing region corresponding to the first characteristic information is not used for the analysis processing.
The second characteristic information may be a plurality of pieces of characteristic information calculated from different captured images.
The determination step may change the predetermined distance according to a number of pieces of the second characteristic information corresponding to the predetermined parameter.
A storage step of storing a plurality of parameters used in the analysis processing step in a storage unit in association with the second characteristic information corresponding to the parameters may be further included, in which
in a case where the determination step determines not to allow the analysis processing to be performed with the predetermined parameter, the determination step may cause the storage unit to store a new parameter of the analysis processing step corresponding to the first characteristic information in association with the first characteristic information.
The determination step may determine whether characteristic information similar to the first characteristic information is stored in the storage unit in a case where it is determined not to allow the analysis processing to be performed with the predetermined parameter, and in a case where the characteristic information is not stored, the determination step may cause the storage unit to store a new parameter of the analysis processing step corresponding to the first characteristic information.
In a case where it is determined that the characteristic information similar to the first characteristic information is stored in the storage unit, the determination step may allow the analysis processing to be performed on the basis of a parameter corresponding to the similar characteristic information.
The determination step may allow the analysis processing to be performed with the predetermined parameter in a case where the first characteristic information and the second characteristic information are within a predetermined distance on a statistical distribution.
A display control step of causing a display unit to display a display form indicating that it is possible to perform the analysis processing with the predetermined parameter in a case where the determination step determines to allow the analysis processing to be performed with the predetermined parameter may be further included.
The first characteristic information and the second characteristic information may be at least one of a luminance value, cell density, cell circularity, a cell circumferential length, or a local feature amount, and the distance may be at least one of a Mahalanobis distance or a Euclidean distance.
In order to solve the above problem, according to the present disclosure, there is provided an information processing device including:
In order to solve the above problems, according to the present disclosure, there is provided a microscope system including:
Hereinafter, embodiments of an information processing program, an information processing device, an information processing method, and a microscope system will be described with reference to the drawings. Hereinafter, main components of the information processing program, the information processing device, the information processing method, and the microscope system will be mainly described, but the information processing program, the information processing device, the information processing method, and the microscope system may include components and functions that are not illustrated or described. The following description does not exclude components and functions that are not illustrated or described.
The microscope system 5000 may be configured as a so-called whole slide imaging (WSI) system or a digital pathology system, and may be used for pathological diagnosis. Alternatively, the microscope system 5000 may be designed as a fluorescence imaging system, or particularly, as a multiple fluorescence imaging system.
For example, the microscope system 5000 may be used to make an intraoperative pathological diagnosis or a telepathological diagnosis. In the intraoperative pathological diagnosis, the microscope device 5100 can acquire the data of the biological sample S acquired from the subject of the operation while the operation is being performed, and then transmit the data to the information processing unit 5120. In the telepathological diagnosis, the microscope device 5100 can transmit the acquired data of the biological sample S to the information processing unit 5120 located in a place away from the microscope device 5100 (such as in another room or building). In these diagnoses, the information processing unit 5120 then receives and outputs the data. On the basis of the output data, the user of the information processing unit 5120 can make a pathological diagnosis.
The biological sample S may be a sample containing a biological component. The biological component may be a tissue, a cell, a liquid component of the living body (blood, urine, or the like), a culture, or a living cell (a myocardial cell, a nerve cell, a fertilized egg, or the like).
The biological sample may be a solid, a specimen fixed with a fixing reagent such as paraffin, or a solid formed by freezing. The biological sample can be a section of the solid. A specific example of the biological sample may be a section of a biopsy sample.
The biological sample may be one that has been subjected to a treatment such as staining or labeling. The treatment may be staining for indicating the morphology of the biological component or for indicating the substance (surface antigen or the like) contained in the biological component, and can be hematoxylin-eosin (HE) staining or immunohistochemistry staining, for example. The biological sample may be one that has been subjected to the above treatment with one or more reagents, and the reagent(s) can be a fluorescent dye, a coloring reagent, a fluorescent protein, or a fluorescence-labeled antibody.
The specimen may be prepared from a specimen or a tissue sample collected from a human body for the purpose of pathological diagnosis, clinical examination, or the like. Alternatively, the specimen is not necessarily of the human body, and may be derived from an animal, a plant, or some other material. The specimen may differ in property, depending on the type of the tissue being used (such as an organ or a cell, for example), the type of the disease being examined, the attributes of the subject (such as age, gender, blood type, and race, for example), or the subject's daily habits (such as an eating habit, an exercise habit, and a smoking habit, for example). The specimen may be managed by being affixed with identification information (bar-code information, QR-code (trademark) information, or the like) by which each specimen can be identified.
The light irradiation unit 5101 is a light source for illuminating the biological sample S, and is an optical unit that guides light emitted from the light source to a specimen. The light source can illuminate a biological sample with visible light, ultraviolet light, infrared light, or a combination thereof. The light source may be one or more of the following: a halogen lamp, a laser light source, an LED lamp, a mercury lamp, and a xenon lamp. The light source in fluorescent observation may be of a plurality of types and/or wavelengths, and the types and the wavelengths may be appropriately selected by a person skilled in the art. The light irradiation unit may have a configuration of a transmissive type, a reflective type, or an epi-illumination type (a coaxial epi-illumination type or a side-illumination type).
The optical unit 5102 is designed to guide the light from the biological sample S to the signal acquisition unit 5103. The optical unit may be designed to enable the microscope device 5100 to observe or capture an image of the biological sample S.
The optical unit 5102 may include an objective lens. The type of the objective lens may be appropriately selected by a person skilled in the art, in accordance with the observation method. The optical unit may also include a relay lens for relaying an image magnified by the objective lens to the signal acquisition unit. The optical unit may further include optical components other than the objective lens and the relay lens, and the optical components may be an eyepiece, a phase plate, a condenser lens, and the like.
The optical unit 5102 may further include a wavelength separation unit designed to separate light having a predetermined wavelength from the light from the biological sample S. The wavelength separation unit may be designed to selectively cause light having a predetermined wavelength or a predetermined wavelength range to reach the signal acquisition unit. The wavelength separation unit may include one or more of the following: a filter, a polarizing plate, a prism (Wollaston prism), and a diffraction grating that selectively pass light, for example. The optical component(s) included in the wavelength separation unit may be disposed in the optical path from the objective lens to the signal acquisition unit, for example. The wavelength separation unit is provided in the microscope device in a case where fluorescent observation is performed, or particularly, where an excitation light irradiation unit is included. The wavelength separation unit may be designed to separate fluorescence or white light from fluorescence.
The signal acquisition unit 5103 may be designed to receive light from the biological sample S, and convert the light into an electrical signal, or particularly, into a digital electrical signal. The signal acquisition unit may be designed to be capable of acquiring data about the biological sample S on the basis of the electrical signal. The signal acquisition unit may be designed to be capable of acquiring data of an image (a captured image, or particularly, a still image, a time-lapse image, or a moving image) of the biological sample S, or particularly, may be designed to acquire data of an image enlarged by the optical unit. The signal acquisition unit includes one or more image sensors, such as CMOS or CCD sensors, that include a plurality of pixels arranged in a one-dimensional or two-dimensional manner. The signal acquisition unit may include an image sensor for acquiring a low-resolution image and an image sensor for acquiring a high-resolution image, or may include an image sensor for sensing for AF or the like and an image sensor for outputting an image for observation or the like. The image sensor may include not only the plurality of pixels, but also a signal processing unit (including one, two, or three of the following: a CPU, a DSP, and a memory) that performs signal processing using pixel signals from the respective pixels, and an output control unit that controls outputting of image data generated from the pixel signals and processed data generated by the signal processing unit. Moreover, the image sensor can include an asynchronous event detection sensor that detects, as an event, that a luminance change of a pixel that photoelectrically converts incident light exceeds a predetermined threshold. The image sensor including the plurality of pixels, the signal processing unit, and the output control unit can preferably be designed as a one-chip semiconductor device.
The control unit 5110 controls imaging being performed by the microscope device 5100. For the imaging control, the control unit can drive movement of the optical unit 5102 and/or the sample placement unit 5104, to adjust the positional relationship between the optical unit and the sample placement unit. The control unit 5110 can move the optical unit and/or the sample placement unit in a direction toward or away from each other (in the optical axis direction of the objective lens, for example). The control unit may also move the optical unit and/or the sample placement unit in any direction in a plane perpendicular to the optical axis direction. For the imaging control, the control unit may control the light irradiation unit 5101 and/or the signal acquisition unit 5103.
The sample placement unit 5104 may be designed to be capable of securing the position of a biological sample on the sample placement unit, and may be a so-called stage. The sample placement unit 5104 may be designed to be capable of moving the position of the biological sample in the optical axis direction of the objective lens and/or in a direction perpendicular to the optical axis direction.
The information processing unit 5120 can acquire, from the microscope device 5100, data (imaging data or the like) acquired by the microscope device 5100. The information processing unit can perform image processing on the imaging data. The image processing may include color separation processing. The color separation processing may include a process of extracting data of the optical component of a predetermined wavelength or in a predetermined wavelength range from the imaging data to generate image data, or a process of removing data of the optical component of a predetermined wavelength or in a predetermined wavelength range from the imaging data. The image processing may also include an autofluorescence separation process for separating the autofluorescence component and the dye component of a tissue section, and a fluorescence separation process for separating wavelengths between dyes having different fluorescence wavelengths from each other. The autofluorescence separation process may include a process of removing the autofluorescence component from image information about another specimen, using an autofluorescence signal extracted from one specimen of the plurality of specimens having the same or similar properties.
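As a hedged illustration only (not the implementation of the information processing unit 5120), the color separation described above could be sketched as simple least-squares linear unmixing; the channel count, reference spectra matrix, and image shapes below are hypothetical.

import numpy as np

def unmix(image, spectra):
    # Separate an (H, W, C) multichannel image into per-component abundance
    # maps, where spectra is a (C, D) matrix whose columns are the reference
    # spectra of the D dyes/autofluorescence components over the C channels.
    h, w, c = image.shape
    pixels = image.reshape(-1, c).T                      # (C, N)
    abundances, *_ = np.linalg.lstsq(spectra, pixels, rcond=None)
    return np.clip(abundances.T.reshape(h, w, -1), 0.0, None)

# Hypothetical usage: 4 detection channels, 2 dyes plus autofluorescence.
rng = np.random.default_rng(0)
reference_spectra = rng.random((4, 3))
stack = rng.random((64, 64, 4))
maps = unmix(stack, reference_spectra)                   # (64, 64, 3)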
The information processing unit 5120 may transmit data for the imaging control to the control unit 5110, and the control unit 5110 that has received the data may control the imaging being performed by the microscope device 5100 in accordance with the data.
The information processing unit 5120 may be designed as an information processing device such as a general-purpose computer, and may include a CPU, RAM, and ROM. The information processing unit may be included in the housing of the microscope device 5100, or may be located outside the housing. Further, the various processes or functions to be executed by the information processing unit may be realized by a server computer or a cloud connected via a network.
The method to be implemented by the microscope device 5100 to capture an image of the biological sample S may be appropriately selected by a person skilled in the art, in accordance with the type of the biological sample, the purpose of imaging, and the like. Examples of the imaging method are described below.
One example of the imaging method is as follows. The microscope device can first identify an imaging target region. The imaging target region may be identified so as to cover the entire region in which the biological sample exists, or may be identified so as to cover the target portion (the portion in which the target tissue section, the target cell, or the target lesion exists) of the biological sample. Next, the microscope device divides the imaging target region into a plurality of divided regions of a predetermined size, and the microscope device sequentially captures images of the respective divided regions. As a result, an image of each divided region is acquired.
The positional relationship between the microscope device and the sample placement unit is adjusted so that an image of the next divided region is captured after one divided region is captured. The adjustment may be performed by moving the microscope device, moving the sample placement unit, or moving both. In this example, the imaging device that captures an image of each divided region may be a two-dimensional image sensor (an area sensor) or a one-dimensional image sensor (a line sensor). The signal acquisition unit may capture an image of each divided region via the optical unit. Further, images of the respective divided regions may be continuously captured while the microscope device and/or the sample placement unit is moved, or movement of the microscope device and/or the sample placement unit may be stopped every time an image of a divided region is captured. The imaging target region may be divided so that the respective divided regions partially overlap, or the imaging target region may be divided so that the respective divided regions do not overlap. A plurality of images of each divided region may be captured while the imaging conditions such as the focal length and/or the exposure time are changed.
Furthermore, the information processing device can combine a plurality of adjacent divided regions to generate image data of a wider region. As the combining process is performed on the entire imaging target region, an image of a wider region can be acquired with respect to the imaging target region. Furthermore, image data with lower resolution can be generated from the image of the divided region or the image subjected to the combining process.
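The combining and lower-resolution generation described above could, as one simplified sketch, be realized as follows, assuming equally sized, non-overlapping tiles arranged on a regular grid (the actual system may handle overlapping tiles, blending, and pyramidal image formats).

import numpy as np

def stitch(tiles):
    # tiles: 2-D list of equally sized (h, w) grayscale tile arrays,
    # indexed as tiles[row][col]; returns the combined wide-region image.
    rows = [np.concatenate(row, axis=1) for row in tiles]
    return np.concatenate(rows, axis=0)

def downscale(image, factor=4):
    # Generate lower-resolution image data by block averaging.
    h, w = image.shape
    h, w = h - h % factor, w - w % factor
    blocks = image[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))

# Hypothetical usage with a 2 x 3 grid of 256 x 256 tiles.
tiles = [[np.full((256, 256), r * 3 + c, dtype=float) for c in range(3)]
         for r in range(2)]
wide_image = stitch(tiles)            # (512, 768)
thumbnail = downscale(wide_image, 8)  # (64, 96)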
Another example of the imaging method is as follows. The microscope device can first identify an imaging target region. The imaging target region may be identified so as to cover the entire region in which the biological sample exists, or may be identified so as to cover the target portion (the portion in which the target tissue section or the target cell exists) of the biological sample. Next, the microscope device scans a region (also referred to as a “divided scan region”) of the imaging target region in one direction (also referred to as a “scan direction”) in a plane perpendicular to the optical axis, and thus captures an image. After the scanning of the divided scan region is completed, the divided scan region next to the scan region is then scanned. These scanning operations are repeated until an image of the entire imaging target region is captured.
For the scanning of each divided scan region, the positional relationship between the microscope device and the sample placement unit is adjusted so that an image of the next divided scan region is captured after an image of one divided scan region is captured. The adjustment may be performed by moving the microscope device, moving the sample placement unit, or moving both. In this example, the imaging device that captures an image of each divided scan region may be a one-dimensional image sensor (a line sensor) or a two-dimensional image sensor (an area sensor). The signal acquisition unit may capture an image of each divided region via a magnifying optical system. Also, images of the respective divided scan regions may be continuously captured while the microscope device and/or the sample placement unit is moved. The imaging target region may be divided so that the respective divided scan regions partially overlap, or the imaging target region may be divided so that the respective divided scan regions do not overlap. A plurality of images of each divided scan region may be captured while the imaging conditions such as the focal length and/or the exposure time are changed.
Furthermore, the information processing device can combine a plurality of adjacent divided scan regions to generate image data of a wider region. As the combining process is performed on the entire imaging target region, an image of a wider region can be acquired with respect to the imaging target region. Furthermore, image data with lower resolution can be generated from the image of the divided scan region or the image obtained by the combining process.
As the storage unit 100, for example, a storage device such as a nonvolatile semiconductor memory or a hard disk drive is used. In the storage unit 100, various control parameters, programs, and the like according to the present embodiment are stored in advance. Furthermore, the storage unit 100 includes an input image database 102 and an analyzed ROI database 104.
The input image database 102 stores digital captured images captured by the microscope device 5100. For example, the input image database 102 stores first region images captured at a plurality of different depths in the optical axis direction of the optical unit of the microscope device 5100. Moreover, the WSI image for each depth, generated by performing stitching processing on a plurality of the first region images at the same depth, is stored. Furthermore, the stored image may be an image of a partial region indicated by annotation data (such as a tumor region indicated by a pathologist or a researcher) accompanying each image. The staining of these captured images may be hematoxylin-eosin (HE) staining, immunohistochemistry (IHC) staining, or fluorescent staining.
The analyzed ROI database 104 stores ROI region information associated with a captured image stored in the input image database 102, a feature amount calculated on the basis of an image in an ROI region, and a parameter associated with the feature amount in association with each other. Note that, in the analyzed ROI database 104, only the feature amount calculated on the basis of the image in the ROI region and the parameter associated with the feature amount may be stored. Furthermore, details of the feature amount and the like will be described later.
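As a minimal sketch of how such a record might be organized (the field names and value types below are assumptions for illustration, not the actual schema of the analyzed ROI database 104):

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class AnalyzedRoiRecord:
    image_id: Optional[str]       # may be None when only the feature amount
    roi_bbox: Optional[tuple]     # and parameter are stored
    feature_vector: List[float]   # characteristic information of the ROI
    analysis_params: dict         # parameters associated with the feature

analyzed_roi_database: List[AnalyzedRoiRecord] = []
analyzed_roi_database.append(AnalyzedRoiRecord(
    image_id="slide_0001",
    roi_bbox=(1024, 2048, 512, 512),      # x, y, width, height (hypothetical)
    feature_vector=[0.82, 13.5, 0.91],    # e.g. luminance, density, circularity
    analysis_params={"threshold": 0.4, "filter_size": 5},
))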
The processing unit 200 includes a central processing unit (CPU) or a microprocessor (MPU), and implements each of the units described below by executing a program stored in the storage unit 100. The processing unit 200 analyzes a digital captured image captured by the microscope device 5100 and generates analysis information. Note that details of the processing unit 200 will be described later.
The program used by the processing unit 200 may be stored in the storage unit 100, or may be stored in a storage medium such as a digital versatile disc (DVD), a cloud computer, or the like. Furthermore, in the processing unit 200, the program may be executed by a central processing unit (CPU) or a microprocessor (MPU) using a random access memory (RAM) or the like as a work area, or may be executed by an integrated circuit such as an application specific integrated circuit (ASIC) or a field programmable gate array (FPGA).
The operation unit 300 includes, for example, a keyboard, a mouse, and the like. The operation unit 300 inputs an instruction signal corresponding to an operation of an observer (user), for example, a pathologist, to the processing unit 200.
The display unit 400 is, for example, a monitor. As will be described later, the display unit 400 displays a captured image, data related to analysis, a user interface (UI screen), and the like.
Here, details of the processing unit 200 will be described. The processing unit 200 includes an acquisition unit 202, a region extraction unit 204, a feature amount calculation unit 206, a region selection unit 208, a similarity determination processing unit 210, a parameter setting unit 212, an analysis processing unit 214, and a display control unit 216.
The acquisition unit 202 acquires a first image (target image) obtained by imaging a specimen tissue to be analyzed from the input image database 102. Alternatively, the acquisition unit 202 may directly acquire the target image from the microscope device 5100. Alternatively, the acquisition unit 202 may directly acquire the target image from another microscope device, a storage device, or the like via, for example, an in-hospital network.
The region extraction unit 204 sets a region of interest (ROI) for calculating the feature amount in a captured image G100.
The feature amount calculation unit 206 calculates a feature amount from each of the plurality of ROIs set by the region extraction unit 204. The feature amount is, for example, a luminance value, cell density, nucleus circularity, a nucleus circumferential length, color information, a frequency characteristic, or a local feature amount such as Scale-Invariant Feature Transform (SIFT), Speeded-Up Robust Features (SURF), local binary pattern (LBP), Accelerated-KAZE (AKAZE), or histogram of oriented gradients (HOG). In the present embodiment, these feature amounts are used, but the present embodiment is not limited thereto. For example, the feature amount may be a feature amount output from a neural network (NN) such as a convolutional neural network (CNN). As described above, the feature amount is only required to be a feature amount used for classification, identification, recognition, and the like of a pathological form by extracting image characteristics of a pathological image. For example, manually designed feature amounts include a color feature (luminance value, staining intensity, or the like), a shape feature (circularity, circumferential length, or the like), a density, a distance from a specific form, and a local feature amount (AKAZE, SIFT, HOG, SURF, LBP, or the like) of the specific form, as described above. Furthermore, in a case where the data to be analyzed is IHC staining or fluorescent staining, information such as the number of positive/negative cells calculated in the tissue may be used.
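As a rough sketch of how some of these feature amounts could be computed for one ROI (this is not the feature amount calculation unit 206 itself; scikit-image is assumed to be available, and the nucleus mask is assumed to come from a separate segmentation step):

import numpy as np
from skimage import measure

def roi_features(roi_gray, nucleus_mask):
    # roi_gray: (H, W) grayscale image of the ROI.
    # nucleus_mask: (H, W) boolean mask of segmented nuclei.
    labels = measure.label(nucleus_mask)
    props = measure.regionprops(labels)
    areas = np.array([p.area for p in props])
    perimeters = np.array([p.perimeter for p in props])
    circularity = 4.0 * np.pi * areas / np.maximum(perimeters, 1e-6) ** 2
    return {
        "mean_luminance": float(roi_gray.mean()),
        "cell_density": len(props) / roi_gray.size,   # nuclei per pixel
        "mean_circularity": float(circularity.mean()) if len(props) else 0.0,
        "mean_perimeter": float(perimeters.mean()) if len(props) else 0.0,
    }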
The region selection unit 208 selects a representative ROI on the basis of the distribution of the feature amounts calculated from the image information in each of the plurality of ROIs set by the region extraction unit 204.
The region selection unit 208 clusters the feature amounts and selects a representative ROI from each clustered region.
For example, the region selection unit 208 selects a representative ROI from each of clusters G100, G102, and G104. The region selection unit 208 according to the present embodiment sets the ROI closest to the centroid of each of the clusters G100, G102, and G104 as the representative ROI, and selects F100, F102, and F104 as the feature amounts of the representative ROIs. As described above, the representative ROIs and the feature amounts corresponding to them can be selected by statistical processing. Since the representative ROIs and their corresponding feature amounts are thus selected without depending on the experience of the observer, the reproducibility of the selection of the representative ROIs and the corresponding feature amounts can be improved.
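One way to realize this centroid-based selection, sketched here with scikit-learn's k-means (the clustering algorithm, the number of clusters, and the feature dimensionality are assumptions):

import numpy as np
from sklearn.cluster import KMeans

def select_representative_rois(features, n_clusters=3, random_state=0):
    # features: (n_rois, n_dims) array of feature amounts, one row per ROI.
    # Returns, for each cluster, the index of the ROI closest to the centroid.
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=random_state)
    labels = km.fit_predict(features)
    representatives = []
    for k in range(n_clusters):
        members = np.where(labels == k)[0]
        dists = np.linalg.norm(features[members] - km.cluster_centers_[k], axis=1)
        representatives.append(int(members[np.argmin(dists)]))
    return representatives

# Hypothetical usage: 100 ROIs described by 5-dimensional feature amounts.
feats = np.random.default_rng(1).random((100, 5))
representative_indices = select_representative_rois(feats)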
The similarity determination processing unit 210 determines similarity between the feature amount of the ROI corresponding to the predetermined parameter stored in the analyzed ROI database 104 and the feature amount selected by the region selection unit 208. In a case of determining that the feature amount of the ROI corresponding to the predetermined parameter is similar to the feature amount selected by the region selection unit 208, the similarity determination processing unit 210 supplies the predetermined parameter to the parameter setting unit 212. Note that, in the present embodiment, determination of similarity between feature amounts may be referred to as comparison of feature amounts.
More specifically, the similarity determination processing unit 210 can perform the similarity determination on the basis of the feature amounts F100, F102, and F104 corresponding to the respective representative ROIs in the characteristic information space and a distance, a spatial vector, or the like between these feature amounts and the feature amounts corresponding to the respective ROIs to be compared. For example, the similarity determination processing unit 210 determines that the feature amounts are similar in a case where a Euclidean distance or a Mahalanobis distance, which is a geometric linear distance, is equal to or less than a threshold. Alternatively, for example, the similarity determination processing unit 210 may determine that the feature amounts are similar in a case where a cosine similarity value, which indicates closeness in direction of the spatial vectors, is equal to or more than a threshold. Alternatively, after similarity has been compared in each characteristic information space as described above, similarity between sets may be evaluated, using as common terms the elements whose distances/vectors are close. As an index at that time, a Jaccard coefficient, a Dice coefficient, or a Simpson coefficient, which are similarity indexes between sets, may be used.
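A minimal sketch of such a distance-based determination is given below; the thresholds are hypothetical, and several analyzed ROIs are assumed to be available so that a covariance matrix can be estimated.

import numpy as np

def is_similar_mahalanobis(candidate, analyzed_features, max_distance=3.0):
    # candidate: (n_dims,) feature amount of a representative ROI.
    # analyzed_features: (n_samples, n_dims) feature amounts of the analyzed
    # ROIs associated with the parameter; True if within max_distance.
    mean = analyzed_features.mean(axis=0)
    cov = np.cov(analyzed_features, rowvar=False)
    inv_cov = np.linalg.pinv(cov)      # pseudo-inverse for numerical stability
    diff = candidate - mean
    distance = float(np.sqrt(diff @ inv_cov @ diff))
    return distance <= max_distance

def cosine_similarity(a, b):
    # Alternative criterion: closeness in direction of the spatial vectors.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))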
In a case of determining that the feature amounts F100, F102, and F104 are similar to the feature amount of the ROI corresponding to the parameter, the similarity determination processing unit 210 causes the display unit 400 via the display control unit to display a display form indicating to the user that analysis can be performed with the parameter being used. On the other hand, in a case of determining that the feature amounts F100, F102, and F104 are not similar to the feature amount of the ROI corresponding to the parameter, the similarity determination processing unit 210 causes the display unit 400 via the display control unit to display a display form prompting the observer (user) to readjust the parameter. Note that the similarity determination processing unit 210 according to the present embodiment corresponds to a determination processing unit.
The parameter setting unit 212 sets the parameter supplied from the similarity determination processing unit 210 in the analysis processing unit 214. Furthermore, the parameter setting unit 212 can also set a parameter that is set via the operation unit 300 in the analysis processing unit 214.
The analysis processing unit 214 analyzes the target image using the parameter set by the parameter setting unit 212. For example, the analysis processing unit 214 performs analysis processing according to each case using the set parameter. The analysis processing according to the present embodiment performs, for example, processing of extracting a lesion from a pathological image that is a target image, but is not limited thereto. For example, in the analysis processing, it is also possible to perform processing of extracting a target structure such as a cell nucleus from a pathological image that is a target image. Furthermore, for example, the analysis processing unit 214 may configure a plurality of image processing filters according to the set parameter, and perform analysis processing such as extracting a lesion by a combination of the plurality of image processing filters. Alternatively, processing including a combination of frequency processing and gradation conversion processing may be performed according to the set parameter.
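As one hedged illustration of configuring processing according to the set parameters (the filters, parameter names, and pipeline below are assumptions, not the actual analysis processing unit 214), a smoothing filter, a threshold, and a size filter could be combined as follows:

import numpy as np
from scipy import ndimage

def extract_candidate_regions(image, params):
    # image: (H, W) grayscale pathological image with values in 0..1.
    # params: e.g. {"smoothing_sigma": 2.0, "threshold": 0.6, "min_area": 50}.
    # Returns a boolean mask of candidate lesion (or nucleus) regions.
    smoothed = ndimage.gaussian_filter(image, sigma=params["smoothing_sigma"])
    mask = smoothed > params["threshold"]
    labels, n = ndimage.label(mask)
    sizes = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
    kept_labels = np.arange(1, n + 1)[sizes >= params["min_area"]]
    return np.isin(labels, kept_labels)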
The display control unit 216 performs processing of causing the display unit 400 to display various images. The display control unit 216 can also perform processing of reading a predetermined UI screen stored in the storage unit 100 and causing the display unit 400 to display the UI screen. In this case, an instruction signal input via the UI screen is supplied to each unit of the processing unit 200.
Next, the region extraction unit 204 sets an ROI for calculating a feature amount in the captured image G100, the feature amount calculation unit 206 calculates a feature amount from each of the plurality of ROIs set by the region extraction unit 204, and the region selection unit 208 selects a representative ROI from each of the clusters G100, G102, and G104, for example (step S102).
Next, the similarity determination processing unit 210 determines similarity between the feature amount of the ROI corresponding to the predetermined parameter stored in the analyzed ROI database 104 and the feature amount selected by the region selection unit 208 (step S106). In a case of determining that they are similar (y in step S106), the similarity determination processing unit 210 causes the display unit 400 via the display control unit to display a display form indicating that analysis can be performed by the predetermined parameter adjusted in the past by the user (step S108).
On the other hand, in a case of determining that they are not similar (n in step S106), the similarity determination processing unit 210 causes the display unit 400 via the display control unit 216 to display a display form prompting the observer to readjust the parameter (step S110), and ends the entire processing.
As described above, the acquisition unit 202 acquires an image obtained by imaging a specimen tissue, and the analysis processing unit 214 analyzes the image using the predetermined parameter. In this case, the similarity determination processing unit 210 determines whether to allow the analysis processing to be performed with the predetermined parameter on the basis of the characteristic information of the tissue form within the ROI existing in the image and the second characteristic information of the tissue form in the image corresponding to the predetermined parameter. Thus, it is possible to objectively determine whether or not the analysis processing can be performed on the image acquired by the acquisition unit 202 using the predetermined parameter. By extracting the ROI from the image and performing the similarity determination processing on the characteristic information based on the ROI in this manner, it is possible to objectively determine whether or not parameters adjusted in the past are applicable to a new image to be analyzed, without relying on experience or knowledge.
The microscope system 5000 according to a second embodiment is different from the microscope system 5000 according to the first embodiment in that the processing unit 200 further includes a parameter adjustment control unit 218. Hereinafter, differences from the microscope system 5000 according to the first embodiment will be described.
The parameter adjustment control unit 218 can perform control processing when performing parameter adjustment.
Furthermore, the parameter adjustment control unit 218 displays a pathological image W112a as the target image in the screen region W112 and indicates the position of the representative ROI on it. Similarly, the parameter adjustment control unit 218 displays a result image W112b of the clustering by the region selection unit 208 in the screen region W112. In the result image W112b, regions having close characteristic information are visualized, for example by being displayed in the same color. Thus, the correspondence between the characteristic information and visual characteristics can be fed back to the user. By this feedback, for example, the user can check the validity of the number of cluster divisions or the like.
Similarly, the parameter adjustment control unit 218 causes the display unit 400 via the display control unit 216 to display the UI screen U100 and a distribution image W114a of the feature amount of each ROI read from the storage unit 100 by the similarity determination processing unit 210 in the screen region W114. Furthermore, the ROI 6 determined not to be similar by the similarity determination processing unit 210 is also displayed in the distribution image W114a. Thus, the statistical positional relationship of the ROI 6 determined not to be similar can be confirmed on the feature amount distribution diagram. Therefore, the observer can objectively grasp the statistical position of the ROI 6 determined not to be similar.
Similarly, the parameter adjustment control unit 218 causes the display unit 400, via the display control unit 216, to display an enlarged view 116a of the image in the ROI 6 determined not to be similar in the screen region W116. Thus, the observer can confirm the pathological form of the ROI 6 determined not to be similar in more detail, and can grasp the cause of the deviation of its feature amount on the statistical distribution.
Moreover, the parameter adjustment control unit 218 displays a display form W116b prompting the observer to readjust the parameter. Furthermore, the display form W116b is used to input a signal (y or n) indicating whether or not readjustment of the parameter is accepted.
With such a display form, the observer can objectively grasp the information of the ROI 6 determined not to be similar by the similarity determination processing unit 210. In this manner, by adding a mechanism for recommending an ROI dissimilar to the analyzed ROI, it is possible to select an ROI suitable for parameter adjustment without depending on the experience or pathological knowledge of the user.
Furthermore, in a case where an ROI 6 dissimilar to the analyzed ROIs is detected by the similarity determination processing unit 210, the detected ROI 6 includes characteristic information that has not been analyzed before. That is, the recommended ROI 6 is more appropriate as an ROI for newly performing parameter adjustment. In this way, an ROI whose characteristic information is separated from that of the analyzed ROIs is considered to be highly useful for parameter adjustment.
Next, the observer supplies a new parameter to the parameter setting unit 212 via the operation unit 300 (step S202). Then, the parameter adjustment control unit 218 stores the new parameter supplied via the operation unit 300 and the parameter setting unit 212 in association with the feature amount of the corresponding ROI 6 in the analyzed ROI database 104, and ends the entire processing.
In a case where a signal (n) indicating that the recommended ROI 6 is not selected is input (y in step S302), the parameter setting unit 212 sets a parameter related to any ROI in the analysis target in the analysis processing unit 214 and performs analysis processing.
Subsequently, the parameter adjustment control unit 218 feeds back imaging information related to the recommended ROI 6 to the microscope device 5100 (step S304). For example, an imaging condition for preventing acquisition of the feature amount related to the recommended ROI 6 is fed back to the microscope device 5100 (step S306). For example, the parameter adjustment control unit 218 supplies information for changing the wavelength band and illuminance in the light source of the microscope device 5100 to the microscope device 5100. Thus, for example, the microscope device 5100 can adjust the wavelength and illuminance of the light source in the light irradiation unit 5101 so as to suppress the acquisition of the statistical information related to the recommended ROI 6.
As described above, in the present embodiment, the parameter adjustment control unit 218 causes the display unit 400 via the display control unit 216 to display the recommended ROI dissimilar to the analyzed ROI. Thus, it is possible to select an ROI suitable for parameter adjustment without depending on the experience or pathological knowledge of the user.
The microscope system 5000 according to a third embodiment is different from the microscope system 5000 according to the second embodiment in that the processing unit 200 further includes a determination reference learning unit 220. Hereinafter, differences from the microscope system 5000 according to the second embodiment will be described.
As described above, in a case where the representative ROI 6 (see
The determination reference learning unit 220 causes, for example, the similarity determination reference of the similarity determination processing unit 210 to be relearned. As a relearning method, the determination reference learning unit 220 automatically updates the thresholds related to the Mahalanobis distance and the spatial vector used by the similarity determination processing unit 210. Thus, the threshold can be updated so that the statistic of the representative ROI 6 that has been rejected is determined to be “similar” to the analyzed ROI. Moreover, the learning may be performed so that the feature amount of the representative ROI 6 that has been rejected is plotted at a position close to the analyzed ROI in the distance space where the feature amount of the representative ROI 6 is generated by using deep metric learning or the like. In this way, it is possible to incorporate pathological knowledge or the like possessed by the user into the similarity determination and control the similarity determination reference of the system.
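One simple way such relearning of the determination reference could be sketched (the margin and update rule are assumptions; as noted above, the actual system may instead relearn a metric with deep metric learning):

def relearn_threshold(current_threshold, rejected_roi_distance, margin=1.05):
    # Raise the distance threshold just enough that the rejected representative
    # ROI, which the user judged not to need readjustment, falls inside it.
    return max(current_threshold, rejected_roi_distance * margin)

# Hypothetical usage: the rejected ROI was at Mahalanobis distance 3.4 while
# the threshold was 3.0; the updated threshold becomes about 3.57.
new_threshold = relearn_threshold(3.0, 3.4)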
As described above, according to the present embodiment, in a case where the similarity determination processing unit 210 determines that the representative ROI 6 (see
The microscope system 5000 according to a fourth embodiment is different from the microscope system 5000 according to the second embodiment in that the similarity determination processing unit 210 determines whether there is similar information in the characteristic information stored in the storage unit 100 in a case of determining that the representative ROI 6 (see
On the other hand, in a case where it is not determined (n in step S500), steps S200 to S204 described above are executed.
As described above, according to the present embodiment, in a case where the similarity determination processing unit 210 determines that the representative ROI 6 (see
The microscope system 5000 according to the fifth embodiment is different from the microscope system 5000 according to the fourth embodiment in that the determination distance used in the similarity determination processing unit 210 is changed according to the number of pieces of characteristic information stored in the storage unit 100. Hereinafter, differences from the microscope system 5000 according to the fourth embodiment will be described.
The number of pieces of characteristic information associated with the new parameter increases as the number of captured images increases. Therefore, by shortening the predetermined Mahalanobis distance used for the determination as the characteristic information increases, the extent of the region within which a representative ROI is determined to be similar can be kept constant. Thus, variation in the determination accuracy of the similarity determination processing unit 210 due to variation in the number of pieces of characteristic information data can be suppressed.
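One hedged way to realize this (the scaling rule below is an assumption) is to shrink the distance threshold in proportion to the growth in the number of stored pieces of characteristic information, so that the effective acceptance region stays roughly constant:

import math

def scaled_threshold(base_threshold, n_features, n_reference=10):
    # Shorten the distance used for the similarity determination as the
    # number of pieces of characteristic information increases.
    if n_features <= n_reference:
        return base_threshold
    return base_threshold * math.sqrt(n_reference / n_features)

# Hypothetical: with 10 stored samples the threshold is 3.0; with 40 it is 1.5.
print(scaled_threshold(3.0, 40))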
Note that the present technology can have the following configurations.
(1) An information processing program, including:
(2) The information processing program according to (1), in which the determination step determines not to allow the analysis processing to be performed with the predetermined parameter in a case where the first characteristic information and the second characteristic information are not within a predetermined distance on a statistical distribution.
(3) The information processing program according to (2), further including a display control step of causing a display unit to display a display form in which setting of a new parameter for the analysis processing step is recommended in a case where the determination step determines not to allow the analysis processing to be performed with the predetermined parameter.
(4) The information processing program according to (3), in which
(5) The information processing program according to (3), in which text recommending the setting of a parameter is displayed as the display form.
(6) The information processing program according to any one of (3) to (5), in which the display control step causes the display unit to display an image indicating position information of the first characteristic information in the statistical distribution and the display form side by side.
(7) The information processing program according to any one of (3) to (6), in which the first characteristic information is a plurality of pieces of characteristic information selected from a plurality of pieces of characteristic information based on a plurality of processing regions in the first image, and
(8) The information processing program according to any one of (3) to (6), in which
(9) The information processing program according to any one of (1) to (6), in which
(10) The information processing program according to any one of (1) to (6), in which
(11)
The information processing program according to any one of (2) to (10), in which the second characteristic information is a plurality of pieces of characteristic information calculated from different captured images.
(12)
The information processing program according to (9), in which the determination step changes the predetermined distance according to a number of pieces of the second characteristic information corresponding to the predetermined parameter.
(13) The information processing program according to any one of (1) to (12), further including
(14) The information processing program according to (13), in which the determination step determines whether or not characteristic information similar to the first characteristic information is stored in the storage unit in a case where it is determined not to allow the analysis processing to be performed with the predetermined parameter, and in a case where the characteristic information is not stored, the determination step causes the storage unit to store a new parameter of the analysis processing step corresponding to the first characteristic information.
(15) The information processing program according to (14), in which in a case where it is determined that the characteristic information similar to the first characteristic information is stored in the storage unit, the determination step allows the analysis processing to be performed on the basis of a parameter corresponding to the similar characteristic information.
(16) The information processing program according to (1), in which the determination step allows the analysis processing to be performed with the predetermined parameter in a case where the first characteristic information and the second characteristic information are within a predetermined distance on a statistical distribution.
(17) The information processing program according to (1), further including a display control step of causing a display unit to display a display form indicating that it is possible to perform the analysis processing with the predetermined parameter in a case where the determination step determines to allow the analysis processing to be performed with the predetermined parameter.
(18) The information processing program according to (2), in which the first characteristic information and the second characteristic information are at least one of a luminance value, cell density, cell circularity, a cell circumferential length, or a local feature amount, and the distance is at least one of a Mahalanobis distance or a Euclidean distance.
(19) An information processing device, including:
(20) An information processing method, including:
(21) A microscope system, including:
Aspects of the present disclosure are not limited to the above-described individual embodiments, but include various modifications that can be conceived by those skilled in the art, and the effects of the present disclosure are not limited to the above-described contents. That is, various additions, modifications, and partial deletions are possible without departing from the conceptual idea and spirit of the present disclosure derived from the matters defined in the claims and equivalents thereof.
100 Storage unit
200 Processing unit
202 Acquisition unit
204 Region extraction unit
206 Feature amount calculation unit
208 Region selection unit
210 Similarity determination processing unit
212 Parameter setting unit
214 Analysis processing unit
216 Display control unit
5000 Microscope system
400 Display unit
5120 Information processing device
5100 Microscope device
Priority application: JP 2021-096758, filed June 2021.
International filing: PCT/JP2022/008730, filed 3/2/2022 (WO).