The present disclosure relates to an information processing apparatus, a biological sample observation system, and an image generation method.
For example, in some cases, a color-separated image obtained from multiplexed fluorescence images has low signal intensity and, depending on the dye or antibody involved, the signal can be obscured or buried in the background (resulting in a low S/N). This can make the image difficult to interpret from a biological point of view. As an example, CD3, CD5, and CD7 are all markers expressed in the T cell region, but depending on the combination with the dye, the S/N may become low for some of these markers.
In this regard, for example, to remove noise from a processing-target image, Patent Literature 1 discloses a technique that uses a tomographic image acquired before drug administration or a tomographic image subjected to noise removal processing as a guidance image and performs noise removal processing on the processing-target image using a guided filter.
Patent Literature 1: JP 2019-113475 A
However, due to the mechanism of the color separation algorithm (spectral color separation), the acquired signals are divided by coefficients based on the spectrum, so in cases where the spectral shapes are similar or the signals are inherently small, a color-separated image with a low S/N (signal-to-noise ratio) is obtained. Furthermore, if the obtained image undergoes noise removal (NR) processing with a general isotropic filter, the filter may smooth out even the signals necessary for subsequent cell analysis. Thus, there is a demand for a technique that can extract a necessary signal obscured or buried in the background of the processing-target image while retaining the signal intensity necessary for analysis such as cell analysis. This requirement applies not only to color-separated images but also to other processing-target images.
Thus, the present disclosure provides an information processing apparatus, a biological sample observation system, and an image generation method capable of acquiring a necessary signal obscured or buried in the background of a processing-target image while maintaining the signal intensity necessary for analysis.
An information processing apparatus according to an embodiment of the present disclosure includes: a guide image generation unit configured to sum up a plurality of images each including spectral information regarding a biomarker and divide the result by the number of summed images to generate a guide image for correction.
A biological sample observation system according to an embodiment of the present disclosure includes: an image-capturing device configured to acquire a plurality of images each including spectral information regarding a biomarker; and an information processing apparatus configured to process the plurality of images, wherein the information processing apparatus includes a guide image generation unit configured to sum up the plurality of images and divide the result by the number of summed images to generate a guide image for correction.
An image generation method according to an embodiment of the present disclosure includes: summing up a plurality of images each including spectral information regarding a biomarker, and dividing the result by the number of summed images to generate a guide image for correction.
Embodiments of the present disclosure are now described in detail with reference to the drawings. The present embodiments do not limit the apparatus, system, method, and the like according to the present disclosure. Further, in the specification and the drawings, components having substantially the same functional configuration are designated by the same reference numerals, and redundant description is omitted.
One or more embodiments described herein can each be implemented independently. On the other hand, at least a portion of the plurality of embodiments described herein can be implemented in combination with at least a portion of other embodiments as appropriate. These multiple embodiments may include novel features that are different from each other. Thus, these multiple embodiments can contribute to solving mutually different objectives or challenges and can produce mutually different effects.
The present disclosure is now described in accordance with the order of items illustrated below.
The major technical details according to the present disclosure are now described with reference to
As illustrated in
Guide (1) is an image obtained by simply merging a plurality of multispectral images. Guide (2) is an image obtained by performing image processing (e.g., a median filter or deconvolution) on the image of Guide (1). Guide (3) is an image obtained by merging a plurality of multispectral images after setting values equal to or less than a positive threshold to zero. Guide (4) is an image obtained by performing image processing (e.g., a median filter or deconvolution) on the image of Guide (3). Guide (5) is an image obtained by merging a plurality of multispectral images corresponding only to membrane-stained markers. Guide (6) is an image obtained by performing image processing (e.g., a median filter or deconvolution) on the image of Guide (5). Guide (7) is an image obtained by merging a plurality of multispectral images corresponding only to membrane-stained markers, after setting values equal to or less than a positive threshold to zero. Guide (8) is an image obtained by performing image processing (e.g., a median filter or deconvolution) on the image of Guide (7). Guide (9) is an image obtained by weighting the image of Guide (7) with the expression ratio.
Generation processing of guide images such as Guides (1) to (9), correction processing using the guide images, and the like will be described in detail in later embodiments; these images function as guide images used for NR correction of multispectral images. For example, due to the characteristics of spectral color separation, a color-separated image with a low S/N is obtained in a case where the spectral shapes are similar or the acquired signal is inherently small. To address this challenge, a high-S/N image is used as the guide image and NR correction is applied, thus allowing the restoration of a necessary signal obscured or buried in the background without weakening the signal intensity required for cell analysis. For example, a guide image with a high S/N ratio can be prepared, and only signals at positions spatially correlated between the guide image and an NR target image can be retained while the rest is smoothed. This allows the retention of signals necessary for cell analysis while eliminating only unnecessary background signals. Additionally, a result that can be explained from a biological perspective is obtained, which can lead to improved diagnostic accuracy. Furthermore, it is possible to correct an NR target image with a low S/N based on a guide image created from markers of the same cell type (e.g., markers specifically expressed in the T cell region). Thus, using a guide image created from a marker expressed in a specific cell type makes it possible to improve the result of cell analysis limited to that cell type.
An exemplary configuration of an information processing system according to the present embodiment is described with reference to
As illustrated in
The fluorescent reagent 10A is a chemical used to stain the specimen 20A. The fluorescent reagent 10A is, for example, a fluorescent antibody, a fluorescent probe, or a nuclear staining reagent, but the type of the fluorescent reagent 10A is not limited to these. Examples of fluorescent antibodies include a primary antibody used for direct labeling and a secondary antibody used for indirect labeling. Additionally, the fluorescent reagent 10A is managed with identification information used to identify the fluorescent reagent 10A and its production lot. This identification information is referred to herein as "reagent identification information 11A". The reagent identification information 11A is, for example, barcode information such as one-dimensional barcode information or two-dimensional barcode information, but is not limited to such types of information. Even if the fluorescent reagent 10A is the same type of product, its properties differ for each production lot depending on the production method, the state of the cells from which the antibody is obtained, or the like. For example, in the fluorescent reagent 10A, the spectral information, quantum yield, fluorescent labeling rate, or the like differs for each production lot. The fluorescent labeling rate is also called the "F/P value" (Fluorescein/Protein) and refers to the number of fluorescent molecules labeling an antibody. Thus, in the information processing system according to the present embodiment, the fluorescent reagent 10A is managed for each production lot by being assigned the reagent identification information 11A. In other words, reagent information for each fluorescent reagent 10A is managed for each production lot. This allows the information processing apparatus 100 to separate a fluorescent signal and an autofluorescent signal while taking into consideration slight differences in properties that appear for each production lot. Moreover, managing the fluorescent reagent 10A in units of production lots is merely an example, and the fluorescent reagent 10A may be managed in finer units than production lots.
The specimen 20A is prepared from a specimen or tissue sample collected from a human body for the purpose of pathological diagnosis or clinical examination. Regarding the specimen 20A, for example, the type of tissue used such as an organ or cell, the type of disease targeted, attributes of the subject such as age, gender, blood type, or race, and lifestyle habits of the subject such as diet, exercise, or smoking habits are not limited to particular examples. Furthermore, the specimen 20A is managed with identification information that allows each specimen 20A to be identified. This identification information is referred to herein as "specimen identification information 21A". The specimen identification information 21A is, similarly to the reagent identification information 11A, for example, barcode information such as one-dimensional barcode information or two-dimensional barcode information, but is not limited thereto. The properties of the specimen 20A vary depending on the type of tissue used, the type of disease targeted, the attributes of the subject, or the lifestyle habits of the subject. For example, in the specimen 20A, the measurement channels or spectral information may vary depending on the type of tissue used or the like. Thus, in the information processing system according to the present embodiment, each specimen 20A is individually managed by being assigned the specimen identification information 21A. This allows the information processing apparatus 100 to separate the fluorescent signal and the autofluorescent signal while taking into consideration even the slight differences in properties that appear for each specimen 20A.
The fluorescent-stained specimen 30A is created by staining the specimen 20A with the fluorescent reagent 10A. In the present embodiment, it is assumed that the fluorescent-stained specimen 30A is obtained by staining the specimen 20A with at least one fluorescent reagent 10A, but the number of fluorescent reagents 10A used for staining is not limited to a particular one. Furthermore, the staining method is determined by various combinations of the specimen 20A and the fluorescent reagent 10A and is not limited to a particular one. The fluorescent-stained specimen 30A is input to the information processing apparatus 100 and imaged.
The information processing apparatus 100, as illustrated in
The acquisition unit 110 is configured to acquire information used in various types of processing in the information processing apparatus 100. As illustrated in
The information acquisition unit 111 is configured to acquire reagent information and specimen information. More specifically, the information acquisition unit 111 acquires the reagent identification information 11A attached to the fluorescent reagent 10A being used to generate the fluorescent-stained specimen 30A, and it also acquires the specimen identification information 21A attached to the specimen 20A. For example, the information acquisition unit 111 acquires the reagent identification information 11A and the specimen identification information 21A using a barcode reader or the like. Then, the information acquisition unit 111 acquires reagent information from the database 200 on the basis of the reagent identification information 11A and acquires specimen information on the basis of the specimen identification information 21A. The information acquisition unit 111 stores the acquired information in an information storage unit 121, which will be described later.
The image acquisition unit 112 is configured to acquire image information of the fluorescent-stained specimen 30A, that is, the specimen 20A stained with at least one fluorescent reagent 10A. More specifically, the image acquisition unit 112 includes an image sensor such as a CCD or CMOS sensor and acquires image information by capturing an image of the fluorescent-stained specimen 30A using the image sensor. In this regard, it should be noted that "image information" is a concept that includes not only the image itself of the fluorescent-stained specimen 30A but also measurements that are not visualized as an image, such as numerical values. For example, the image information may include information regarding the wavelength spectrum of fluorescence emitted from the fluorescent-stained specimen 30A. The wavelength spectrum of fluorescence is referred to herein as a fluorescence spectrum. The image acquisition unit 112 stores the image information in an image information storage unit 122, which will be described later.
The storage unit 120 is configured to store information used in various types of processing in the information processing apparatus 100 or store information output from various types of processing. As illustrated in
The information storage unit 121 is configured to store the reagent information and specimen information acquired by the information acquisition unit 111. Moreover, after completion of the analysis processing by an analysis unit 131 and the image information generation processing by an image generation unit 132 (that is, the image information reconstruction processing, which will be described later), the information storage unit 121 may free up space by deleting the reagent information and specimen information used in the processing.
The image information storage unit 122 is configured to store the image information of the fluorescent-stained specimen 30A acquired by the image acquisition unit 112. Moreover, similarly to the information storage unit 121, after completing the analysis processing by the analysis unit 131 and the image information generation processing by the image generation unit 132, that is, completing the image information reconstruction processing, the image information storage unit 122 may increase available space by deleting image information used for processing.
The analysis result storage unit 123 is configured to store a result obtained from the analysis processing performed by the analysis unit 131, which will be described later. For example, the analysis result storage unit 123 stores a fluorescence signal of the fluorescent reagent 10A or an autofluorescence signal of the specimen 20A, which are separated by the analysis unit 131. Additionally, the analysis result storage unit 123 separately provides the results obtained from the analysis processing to the database 200 to improve the analysis accuracy through machine learning or the like. Moreover, after providing the results of the analysis processing to the database 200, the analysis result storage unit 123 may increase its available space by appropriately deleting the result of the analysis processing that it has stored.
The processing unit 130 has a functional configuration that performs various types of processing using the image information, reagent information, and specimen information. As illustrated in
The analysis unit 131 is configured to perform various types of analysis processing using the image information, specimen information, and reagent information. For example, the analysis unit 131 performs processing of separating the autofluorescence signal of the specimen 20A from the image information on the basis of the specimen information and reagent information. This autofluorescence signal, for instance, includes an autofluorescence spectrum as an example of an autofluorescent component, and the fluorescence signal of the fluorescent reagent 10A, for instance, includes a stained fluorescence spectrum as an example of the stained fluorescent component.
More specifically, the analysis unit 131 recognizes one or more components that constitute the autofluorescence signal on the basis of the measurement channel included in the specimen information. For example, the analysis unit 131 recognizes one or more autofluorescence components that constitute the autofluorescence signal. Then, by using the spectral information of these autofluorescence components included in the specimen information, the analysis unit 131 predicts the autofluorescence signal included in the image information. Then, the analysis unit 131 separates the autofluorescence signal and the fluorescence signal from the image information on the basis of the spectral information of the fluorescent component of the fluorescent reagent 10A included in the reagent information and the predicted autofluorescence signal.
In this regard, in the case where the specimen 20A is stained with two or more fluorescent reagents 10A, the analysis unit 131 separates the fluorescence signal of each of these two or more fluorescent reagents 10A from the image information or from the fluorescence signal separated from the autofluorescence signal, based on the specimen information and reagent information. For example, the analysis unit 131 separates the fluorescence signal of each of the fluorescent reagents 10A from the entire fluorescence signals separated from the autofluorescence signal using the spectral information of the fluorescent component of each of the fluorescent reagents 10A included in the reagent information.
Further, in the case where the autofluorescence signal is composed of two or more autofluorescence components, the analysis unit 131 separates the autofluorescence signal of each of these autofluorescent components from the image information or from the autofluorescence signal separated from the fluorescence signals, based on the specimen information and reagent information. For example, the analysis unit 131 uses the spectral information of each autofluorescent component included in the specimen information to separate the autofluorescent signal of each autofluorescent component from the entire autofluorescent signal after being separated from the fluorescent signal.
The analysis unit 131, having separated the fluorescent signal and the autofluorescent signal, performs various types of processing using these signals. For example, the analysis unit 131 may use the separated autofluorescence signal to perform subtraction processing on the image information of another specimen 20A, extracting the fluorescence signal from the image information of the other specimen 20A. The subtraction processing is also called “background subtraction processing”. In the case where there are multiple specimens 20A that are identical or similar in terms of the tissue used for the specimen 20A, the type of disease being targeted, the attribute of the subject, the subject's lifestyle habits, or the like, the autofluorescence signals of these specimens 20A are likely to be similar. The term “similar specimen 20A” used herein includes, for example, a tissue section before staining that is to be stained, a section adjacent to the stained section, a different section within the same block as the stained section, or a section from a different block within the same tissue, including a section taken from a different patient or the like. The tissue section is simply referred to herein as “section”. The “same block” refers to one sampled from the same location as the stained section. The “different block” refers to one sampled from a location different from the stained section. Thus, in the case where the analysis unit 131 is capable of extracting the autofluorescence signal from a certain specimen 20A, the analysis unit 131 may also extract the fluorescence signal from the image information of another specimen 20A by removing the concerned autofluorescence signal from the other specimen 20A. Furthermore, in calculating the S/N ratio using the image information of the other specimen 20A, it is possible for the analysis unit 131 to improve the S/N ratio by using the background after removing the autofluorescence signal.
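As a rough illustration only, the background subtraction described above can be sketched as a pixel-wise subtraction of the separated autofluorescence signal from the image information of the other specimen 20A; the function name, the clipping of negative residues, and the assumption that both arrays share the same shape and intensity scale are illustrative assumptions rather than the specific processing of the present disclosure.

```python
import numpy as np

def background_subtract(image, autofluorescence, clip_negative=True):
    # Pixel-wise removal of an autofluorescence estimate from another specimen's image.
    # `image` and `autofluorescence` are arrays of the same shape on the same intensity scale.
    result = np.asarray(image, dtype=np.float64) - np.asarray(autofluorescence, dtype=np.float64)
    if clip_negative:
        result = np.clip(result, 0.0, None)  # negative residues carry no fluorescence signal
    return result
```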
Further, in addition to the background subtraction processing, the analysis unit 131 is capable of performing various types of processing using the separated fluorescent signal or the separated autofluorescent signal. For example, using these signals, the analysis unit 131 is capable of analyzing the fixation state of the specimen 20A, or performing segmentation or regional fragmentation to recognize a region containing an object in the image information. Examples of such an object include a cell, an intracellular structure, or a tissue. Examples of the intracellular structure include cytoplasm, cell membrane, nucleus, or the like. Examples of the tissue include a tumor site, non-tumor site, connective tissue, blood vessel, vascular wall, lymph vessel, fibrotic structure, necrosis, or the like.
The image generation unit 132 is configured to generate image information on the basis of the fluorescence signal or autofluorescence signal separated by the analysis unit 131, in other words, it reconstructs the image information. For example, the image generation unit 132 is capable of generating image information that includes only the fluorescent signal or only the autofluorescent signal. In this event, if the fluorescent signal is composed of a plurality of fluorescent components, or if the autofluorescent signal is composed of a plurality of autofluorescent components, the image generation unit 132 is capable of generating image information for each component. Furthermore, in the case where the analysis unit 131 performs various types of processing using the separated fluorescent signal or autofluorescent signal, the image generation unit 132 may generate image information indicating the result of those processing operations. Examples of the various types of processing include analysis of the fixation state of the specimen 20A, segmentation, calculation of the S/N value, or the like. This configuration makes it possible to visualize the distribution information of the fluorescent reagent 10A labeled on the target molecule or the like, that is, the two-dimensional spread, intensity, wavelength, and positional relationship of the fluorescence, and particularly, to improve the visibility of information regarding target substances for users such as doctors and researchers in the complex tissue image analysis region.
Furthermore, the image generation unit 132 may be controlled to distinguish the fluorescence signal from the autofluorescence signal on the basis of the fluorescence signal or autofluorescence signal separated by the analysis unit 131, and it may generate image information accordingly. Specifically, it may generate the image information by the control of enhancing the brightness of the fluorescence spectrum of the fluorescent reagent 10A labeled on target molecules, extracting and changing the color of only the fluorescence spectrum of the labeled fluorescent reagent 10A, extracting the fluorescence spectrum of two or more fluorescent reagents 10A from the specimen 20A labeled with two or more fluorescent reagents 10A and changing each to a different color, extracting only the autofluorescence spectrum of the specimen 20A and performing division or subtraction, improving the dynamic range, or the like. This enables the user to clearly distinguish the color information derived from the fluorescent reagent bound to the desired target substance, thus improving the user's visibility.
The guide image generation unit 133 generates a guide image for correction by merging a plurality of color-separated images (an example of multispectral images) and then dividing the result by the number of merged images. The color-separated image is an image generated by color separation processing. Summing up and dividing images here means summing the signal intensities of the images and dividing the result by the number of summed images. Additionally, when generating the guide image, the guide image generation unit 133 is capable of executing image processing after the merging and dividing processing, or performing zero-filling processing on the color-separated images before the merging and dividing processing. In the image processing, for example, a noise removal filter, an edge enhancement filter, or the like is used. Details of such guide image generation processing and the like will be described later.
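For illustration, the following is a minimal sketch of the guide image generation described above (summing color-separated images and dividing by their number), with optional zero-filling and a post-merge noise removal filter; the function name, optional parameters, and the choice of a median filter are assumptions, not the specific implementation of the present disclosure.

```python
import numpy as np
from scipy import ndimage

def generate_guide_image(images, positive_threshold=None, denoise=False):
    # `images`: iterable of (H, W) color-separated images (one per marker/channel).
    stack = np.stack([np.asarray(img, dtype=np.float64) for img in images], axis=0)
    if positive_threshold is not None:
        # Optional zero-filling: pixels at or below the positive threshold are set to zero.
        stack = np.where(stack > positive_threshold, stack, 0.0)
    guide = stack.sum(axis=0) / stack.shape[0]        # sum, then divide by the number of summed images
    if denoise:
        guide = ndimage.median_filter(guide, size=3)  # optional post-merge image processing
    return guide
```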
The correction unit 134 performs noise reduction (NR) correction on the color-separated image (an example of a processing-target image) using the generated guide image. Additionally, the correction unit 134 is capable of performing outlier processing on the color-separated image before the correction processing. The outlier processing involves, for example, removing signal intensity values that significantly deviate from other signal intensity values, such as those of red blood cells. Details on such correction processing or the like will be described later.
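As a hedged sketch of the NR correction described above, the following combines a simple outlier cutoff with a standard box-filter guided filter (in the spirit of He et al.); the radius, eps, and cutoff values, as well as the function names, are illustrative assumptions rather than the specific algorithm of the present disclosure.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def remove_outliers(image, cutoff):
    # Outlier processing: zero out intensities far above the rest (e.g., red blood cell signals).
    return np.where(image > cutoff, 0.0, image)

def guided_nr(target, guide, radius=4, eps=1e-3):
    # Edge-preserving NR of `target` steered by `guide`: signal at positions spatially
    # correlated with the guide is retained, while uncorrelated background is smoothed.
    target = np.asarray(target, dtype=np.float64)
    guide = np.asarray(guide, dtype=np.float64)
    mean = lambda x: uniform_filter(x, size=2 * radius + 1)
    mean_g, mean_t = mean(guide), mean(target)
    cov_gt = mean(guide * target) - mean_g * mean_t
    var_g = mean(guide * guide) - mean_g * mean_g
    a = cov_gt / (var_g + eps)        # per-pixel linear coefficient
    b = mean_t - a * mean_g
    return mean(a) * guide + mean(b)  # locally linear transform of the guide

# Example (values are hypothetical):
# corrected = guided_nr(remove_outliers(color_separated, cutoff=10000), guide_image)
```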
The display unit 140 is configured to present to the user the corrected image information (information regarding the corrected image) generated by the correction unit 134 by displaying it on a display. Moreover, the type of display used as the display unit 140 is not limited to a particular one. Additionally, although not described in detail in the present embodiment, the image information subjected to correction generated by the correction unit 134 may be presented to the user by being projected by a projector or printed by a printer. In other words, the method of outputting the corrected image information is not limited to a particular one.
The control unit 150 is a functional configuration that centrally controls the overall processing performed by the information processing apparatus 100. For example, the control unit 150 controls the start and end of various types of processing as described above on the basis of operation input by the user made through the operation unit 160. Examples of the various types of processing may include imaging processing of the fluorescent-stained specimen 30A, analysis processing, image information generation processing, guide image information generation processing, image information correction processing, and image information display processing. Examples of the image information generation processing may include an image information reconstruction processing. Moreover, the details of the control by the control unit 150 are not limited to a particular one. For example, the control unit 150 may control processing commonly performed in general-purpose computers, PCs, tablet PCs, or the like, such as processing related to an operating system (OS).
The operation unit 160 is configured to receive operation input from the user. More specifically, the operation unit 160 includes various input means such as a keyboard, mouse, button, touch panel, or microphone, and the user is able to perform various operations on the information processing apparatus 100 by operating these input means. Information regarding the operation input performed through the operation unit 160 is provided to the control unit 150.
The database 200 is a device that manages the specimen information, the reagent information, and the result of analysis processing. More specifically, the database 200 associates and manages the specimen identification information 21A and the specimen information and associates and manages the reagent identification information 11A and the reagent information. This arrangement makes it possible for the information acquisition unit 111 to acquire the specimen information on the basis of the specimen identification information 21A of the specimen 20A to be measured from the database 200 and acquire the reagent information on the basis of the reagent identification information 11A of the fluorescent reagent 10A from the database 200.
The specimen information managed by the database 200, as described above, is information including the intrinsic measurement channels and spectral information of autofluorescent component present in the specimen 20A. However, in addition to these, the specimen information may also include target information for each specimen 20A, specifically, the type of tissue used, such as organs, cells, blood, bodily fluids, ascites, pleural effusion, or the like, the type of diseases targeted, attributes of the subject, such as age, gender, blood type, or race, or information regarding the subject's lifestyle habits, such as diet, exercise habits, or smoking habits. The information including the intrinsic measurement channels and spectral information of the autofluorescent component present in the specimen 20A, as well as the target information, may be associated with each specimen 20A. This makes it possible to easily trace information including the intrinsic measurement channels and spectral information of the autofluorescent component present in the specimen 20A from the target information, and this enables the execution of similar separation processing operations previously performed by the analysis unit 131, for example, on the basis of the similarity of target information across multiple specimens 20A, thus reducing measurement time. Moreover, the “tissue used” is not particularly limited to tissue collected from a subject but may include in-vivo tissues and cell lines of humans, animals, or the like, as well as solutions, solvents, solutes, and materials contained in the object of measurement.
In addition, the reagent information managed by the database 200, as described above, includes information containing the spectral information of the fluorescent reagent 10A. However, in addition to this, the reagent information may include information regarding the fluorescent reagent 10A, such as production lot, fluorescent components, antibodies, clones, fluorescence labeling rate, quantum yield, photobleaching coefficient, and absorption cross-section or molar absorptivity coefficient. The photobleaching coefficient is information indicating the ease with which the fluorescence intensity of the fluorescent reagent 10A diminishes. Furthermore, the specimen information and reagent information managed by the database 200 may be managed in different configurations, and in particular, information regarding reagents may be a reagent database that presents the users with the optimal combination of reagents.
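For illustration only, the records below indicate the kind of information the database 200 might associate with the reagent identification information 11A and the specimen identification information 21A; the field names and types are hypothetical and do not represent the actual schema.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ReagentInfo:
    # Hypothetical per-production-lot record keyed by reagent identification information 11A.
    reagent_id: str
    production_lot: str
    fluorescence_spectrum: List[float]      # spectral information of the fluorescent component
    quantum_yield: float = 0.0
    labeling_rate: float = 0.0              # F/P value
    photobleaching_coefficient: float = 0.0

@dataclass
class SpecimenInfo:
    # Hypothetical per-specimen record keyed by specimen identification information 21A.
    specimen_id: str
    tissue_type: str
    measurement_channels: List[str] = field(default_factory=list)
    autofluorescence_spectra: List[List[float]] = field(default_factory=list)
```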
It is herein assumed that the specimen information and the reagent information are either provided by a manufacturer or the like, or independently measured within the information processing system according to the present disclosure. For example, the manufacturer of the fluorescent reagent 10A often does not measure and provide the spectral information, fluorescent labeling rate, or the like for each production lot. Thus, independently measuring and managing such information within the information processing system according to the present disclosure makes it possible to improve the separation accuracy between fluorescent signal and autofluorescent signal. Additionally, for the sake of simplifying management, the database 200 may use a catalog value publicly available from manufacturers or the like or a value documented in various literature sources as specimen information and reagent information, especially as reagent information. However, actual specimen information and reagent information are often different from the catalog value or literature value, so it is generally preferable for specimen information and reagent information to be independently measured and managed within the information processing system according to the present disclosure, as described above.
Further, using machine learning technology or the like that employs the specimen information, reagent information, and analysis processing results managed in the database 200 makes it possible to improve, for example, the accuracy of analysis processing, such as the separation of fluorescent signals from the autofluorescence signal. The component that performs learning using machine learning technology or the like is not limited to a particular one; in the present embodiment, a case where the analysis unit 131 of the information processing apparatus 100 performs the learning is described as an example. For example, the analysis unit 131 uses a neural network to create a classifier or estimator that has been machine-learned with learning data linking the separated fluorescent signals and autofluorescent signals to the image information, specimen information, and reagent information used for separation. Then, in the case where new image information, specimen information, and reagent information are acquired, the analysis unit 131 is capable of inputting these types of information into the classifier or estimator to predict and output the fluorescence signal and autofluorescence signal included in the image information.
Further, it may be possible to identify previously performed similar separation processing operations in which the fluorescent signal and autofluorescent signal were calculated with higher accuracy than the predicted fluorescent signal and autofluorescent signal, statistically or regressively analyze the details of those processing operations, and output a method of improving the separation processing of the fluorescent signal and autofluorescent signal on the basis of the analysis result. The separation processing is, for example, separation processing in which similar types of image information, specimen information, and reagent information are used. The details of the processing include, for example, information and a parameter used in the processing. Moreover, the machine learning method is not limited to the above examples and may employ known machine learning techniques. Additionally, the separation processing between the fluorescent signal and the autofluorescent signal may also be performed by artificial intelligence. Additionally, various types of processing using the fluorescent signal or autofluorescent signal after separation, such as analysis of the fixation or immobilization state of the specimen 20A, segmentation, or the like, may be improved by machine learning techniques.
The description above is given on the configuration example of the information processing system according to the present embodiment. Moreover, the configuration described with reference to
Additionally, the information processing apparatus 100 may perform processing operations other than those described above. For example, by including information such as the quantum yield, fluorescent labeling rate, absorption cross-section, or molar absorptivity coefficient regarding the fluorescent reagent 10A in the reagent information, the information processing apparatus 100 may calculate the number of fluorescent molecules in the image information, the number of antibodies bound to fluorescent molecules, or the like using the image information from which the autofluorescence signal is removed, along with the reagent information.
A basic processing example of the information processing apparatus 100 according to the present embodiment is described with reference to
As illustrated in
In step S1003, the image acquisition unit 112 of the information processing apparatus 100 captures an image of the fluorescent-stained specimen 30A, thereby acquiring image information (e.g., a fluorescence-stained specimen image). In step S1004, the information acquisition unit 111 acquires the reagent information and the specimen information from the database 200, based on the reagent identification information 11A attached to the fluorescent reagent 10A used to produce the fluorescent-stained specimen 30A and the specimen identification information 21A attached to the specimen 20A.
In step S1005, the analysis unit 131 separates the autofluorescence signal of the specimen 20A and the fluorescence signal of the fluorescent reagent 10A from the image information on the basis of the specimen information and the reagent information. In this regard, if the fluorescent signal includes signals of a plurality of fluorescent dyes (Yes in step S1006), the analysis unit 131 separates the fluorescent signal of each fluorescent dye in step S1007. Moreover, if the fluorescent signal does not include signals of a plurality of fluorescent dyes (No in step S1006), the separation processing on the fluorescent signal of each fluorescent dye is not performed in step S1007.
In step S1008, the image generation unit 132 generates the image information using the fluorescent signal separated by the analysis unit 131. For example, the image generation unit 132 generates the image information from which the autofluorescent signal is removed, or the image information in which the fluorescent signal is displayed for each fluorescent dye. In step S1009, the guide image generation unit 133 generates the guide image, and the correction unit 134 performs NR correction on the color-separated image, for example, using the guide image. In step S1010, the display unit 140 displays the image information corrected by the correction unit 134, thereby ending the series of processing operations.
Moreover, the respective steps in the flowchart of
In one example, instead of separating the autofluorescence signal of the specimen 20A and the fluorescence signal of the fluorescent reagent 10A from the image information in step S1005 and then separating the fluorescent signal of each fluorescent dye in step S1007, the analysis unit 131 may directly separate the fluorescence signal of each fluorescent dye from the image information. Additionally, after separating the fluorescence signal of each fluorescent dye from the image information, the analysis unit 131 may separate the autofluorescence signal of the specimen 20A from the image information.
Further, the information processing apparatus 100 may also execute processing not illustrated in
An example of fluorescence separation processing according to the present embodiment is described with reference to
As illustrated in
The concatenation unit 1311 is configured to generate a concatenated fluorescence spectrum by concatenating at least a portion of the plurality of fluorescence spectra acquired by the image acquisition unit 112 in the wavelength direction. For example, the concatenation unit 1311 extracts data of a predetermined width in each fluorescence spectrum, ensuring that each fluorescence spectrum includes the maximum fluorescence intensity value obtained from four fluorescence spectra (labeled as A to D in
In this event, the concatenation unit 1311 aligns the intensities of excitation light corresponding to the respective multiple fluorescence spectra on the basis of the intensity of the excitation light, in other words, corrects the multiple fluorescence spectra, before proceeding with the aforementioned concatenation. More specifically, the concatenation unit 1311 aligns the intensities of the excitation light corresponding to the respective multiple fluorescence spectra by dividing each fluorescence spectrum by the excitation power density, which is the intensity of the excitation light, before performing the aforementioned concatenation. This allows for the acquisition of the fluorescence spectrum in the case of being irradiated with excitation light of the same intensity. Moreover, in the case where the intensity of the irradiated excitation light varies, the intensity of the spectrum absorbed by the fluorescent-stained specimen 30A also varies according to the intensity. This spectrum will be herein referred to as an “absorption spectrum”. Thus, as described above, aligning the intensities of the excitation light corresponding to the respective multiple fluorescence spectra makes it possible to appropriately evaluate the absorption spectrum.
In this regard, labels A to D in
Specifically, the concatenation unit 1311 extracts a fluorescence spectrum SP1 from the fluorescence spectrum illustrated in A of
Moreover, although
Further, as described herein, the intensity of the excitation light may be the excitation power or excitation power density, as mentioned above. The excitation power or excitation power density may be the power or power density obtained by actually measuring the excitation light emitted from the light source, or it may be the power or power density determined from the driving voltage applied to the light source. Moreover, the intensity of excitation light herein may be a value obtained by correcting the above-mentioned excitation power density with the absorption rate of the excitation light by the section being observed, or the amplification rate of the detection signal in the detection system for detecting the fluorescence emitted from the section, for example, in the image acquisition unit 112 or the like. In other words, the intensity of the excitation light herein may be the power density of the excitation light that actually contributes to the excitation of the fluorescent substance, or the value obtained by correcting the power density by the amplification rate of the detection system or the like. By considering absorption rate, amplification factor, or the like, it is possible to appropriately correct the intensity of excitation light that varies depending on fluctuations in machine conditions, environment, or the like, thereby enabling the generation of concatenated fluorescence spectra that allow for higher precision in color separation.
Moreover, the correction value based on the intensity of excitation light for each fluorescence spectrum is not limited to a value that aligns the intensity of excitation light corresponding to each of a plurality of fluorescence spectra, and may be modified in various ways. The above-mentioned correction value is also referred to as an intensity correction value. For example, the signal intensity of a fluorescence spectrum that has an intensity peak on the longer wavelength side tends to be lower than the signal intensity of a fluorescence spectrum that has an intensity peak on the shorter wavelength side. Thus, in the case where a concatenated fluorescence spectrum includes both a fluorescence spectrum with an intensity peak on the longer wavelength side and a fluorescence spectrum with an intensity peak on the shorter wavelength side, there is a tendency for the fluorescence spectrum with an intensity peak on the longer wavelength side to be almost disregarded, and only the fluorescence spectrum with an intensity peak on the shorter wavelength side is extracted. In such cases, for example, by setting a larger intensity correction value for the fluorescence spectrum having an intensity peak on the longer wavelength side, it is also possible to improve the separation precision of the fluorescence spectrum with an intensity peak on the longer wavelength side.
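A minimal sketch of the processing described above: each fluorescence spectrum is divided by its excitation power density (or scaled by an intensity correction value) and the extracted bands are concatenated in the wavelength direction; the band boundaries and variable names are illustrative assumptions.

```python
import numpy as np

def concatenate_spectra(spectra, excitation_powers, bands):
    # `spectra`: 1-D fluorescence spectra, one per excitation wavelength (e.g., A to D).
    # `excitation_powers`: excitation power densities (or intensity correction factors).
    # `bands`: (start, stop) index pairs selecting the extracted range around each peak
    #          (e.g., SP1 to SP4); the widths used here are illustrative.
    parts = []
    for spectrum, power, (start, stop) in zip(spectra, excitation_powers, bands):
        corrected = np.asarray(spectrum, dtype=np.float64) / power  # align excitation intensities
        parts.append(corrected[start:stop])
    return np.concatenate(parts)  # concatenated fluorescence spectrum in the wavelength direction
```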
The color separation unit 1321 includes, for example, a first color separation unit 1321a and a second color separation unit 1321b, and separates the concatenated fluorescence spectrum of the stained section that is input from the concatenation unit 1311 into a color for each molecule. The stained section is also referred to as a stained sample.
More specifically, the first color separation unit 1321a performs color separation processing, which uses a concatenated fluorescence reference spectrum included in the reagent information and a concatenated autofluorescence reference spectrum included in the specimen information input from the information storage unit 121, on the concatenated fluorescence spectrum of the stained sample input from the concatenation unit 1311, thereby separating the concatenated fluorescence spectrum into spectra for respective molecules. Moreover, for color separation processing, for example, techniques such as least squares method (LSM), weighted least squares method (WLSM), non-negative matrix factorization (NMF), non-negative matrix factorization employing Gram matrix tAA, or the like, may be used.
The second color separation unit 1321b performs color separation processing, which uses the concatenated autofluorescence reference spectrum after adjustment that is input from the spectral extraction unit 1322, on the concatenated fluorescence spectrum of the stained sample input from the concatenation unit 1311, thereby separating the concatenated fluorescence spectrum into a spectrum for each molecule. Moreover, for the color separation processing, similarly to the first color separation unit 1321a, techniques such as the least squares method (LSM), weighted least squares method (WLSM), non-negative matrix factorization (NMF), non-negative matrix factorization employing the Gram matrix tAA, or the like may be used.
In this regard, the least squares method, for example, calculates the mixing ratio by fitting the concatenated fluorescence spectrum generated by the concatenation unit 1311 to the reference spectra. Furthermore, in the weighted least squares method, the noise in the concatenated fluorescence spectrum (Signal), which is a measured value, is assumed to follow a Poisson distribution, and weights are applied so as to emphasize errors at low signal levels. However, the upper limit at which the weighting is not applied in the weighted least squares method is set as the Offset value. The Offset value is determined by the characteristics of the sensor used for measurement, and if an image sensor is used as the sensor, separate optimization is required.
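A non-authoritative sketch of weighted least squares unmixing along the lines described above: Poisson-like noise is assumed, so each wavelength is weighted by the inverse of its measured signal, and the Offset value caps the weight at low signal levels; the exact handling of the Offset and all names are assumptions.

```python
import numpy as np

def unmix_wlsm(signal, references, offset=1.0):
    # `signal`: measured concatenated spectrum, shape (L,).
    # `references`: concatenated reference spectra (stained fluorescence and
    #               autofluorescence components), shape (L, K).
    # Returns the mixing ratio of each of the K components.
    w = 1.0 / np.maximum(signal, offset)      # Poisson variance ~ signal; Offset caps the weight
    sw = np.sqrt(w)
    A = references * sw[:, None]              # row-wise weighting of the fitting problem
    b = signal * sw
    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
    return coeffs
```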
The spectral extraction unit 1322 is configured to improve the concatenated autofluorescence reference spectrum so that a more accurate color separation result can be obtained. Based on the color separation result from the color separation unit 1321, the spectral extraction unit 1322 adjusts the concatenated autofluorescence reference spectrum included in the specimen information input from the information storage unit 121 to one that allows a more accurate color separation result to be obtained.
The spectral extraction unit 1322 executes spectral extraction processing using the color separation result input from the first color separation unit 1321a on the concatenated autofluorescence reference spectrum input from the information storage unit 121, and adjusts the concatenated autofluorescence reference spectrum on the basis of the result, thereby improving the concatenated autofluorescence reference spectrum so as to achieve a more accurate color separation result. Moreover, in the spectral extraction processing, for example, techniques such as non-negative matrix factorization (NMF), singular value decomposition (SVD), or the like may be used.
Moreover, although
As described above, the first color separation unit 1321a and the second color separation unit 1321b perform fluorescence separation processing using reference spectra concatenated in the wavelength direction (the concatenated autofluorescence reference spectrum and the concatenated fluorescence reference spectrum), allowing a unique spectrum to be output as the separation result; the separation results are not separated for each excitation wavelength. Thus, the practitioner can obtain the correct spectrum more easily. Additionally, the reference spectrum concerning autofluorescence used for separation (the concatenated autofluorescence reference spectrum) is automatically acquired and used in the fluorescence separation processing, so the practitioner no longer needs to extract the spectrum corresponding to autofluorescence from an appropriate region of the unstained tissue section.
An example of NR correction processing using a guide image according to the present embodiment is described with reference to
As illustrated in
In cell analysis, for example, the display unit 140 displays the NR-corrected image. This allows the user to visually recognize the NR-corrected image. Furthermore, the processing-target image for NR correction processing may be either a single image or multiple images. Such a processing-target image is, for example, selected and set by the user. In this event, the user, for example, performs an input operation on the operation unit 160 to select or change the processing-target image.
As illustrated in
As illustrated in
As for the image processing, for example, techniques such as noise removal processing or edge enhancement processing can be used. In the noise removal processing, for example, a median filter, a mean filter, a Gaussian filter, or the like can be used as the noise removal filter. Additionally, in the edge enhancement processing, a deconvolution filter such as deconvwnr, deconvreg, deconvlucy, or deconvblind, a first-order derivative filter, a second-order derivative filter, or the like can be used as the edge enhancement filter.
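For illustration, the sketch below applies the kinds of filters mentioned above using SciPy; a Laplacian-based sharpening stands in for the MATLAB-style deconvolution filters (deconvwnr, deconvlucy, and so on), and the filter sizes are arbitrary assumptions.

```python
import numpy as np
from scipy import ndimage

def postprocess_guide(guide, mode="median"):
    guide = np.asarray(guide, dtype=np.float64)
    if mode == "median":
        return ndimage.median_filter(guide, size=3)       # noise removal
    if mode == "mean":
        return ndimage.uniform_filter(guide, size=3)      # noise removal
    if mode == "gaussian":
        return ndimage.gaussian_filter(guide, sigma=1.0)  # noise removal
    if mode == "sharpen":
        return guide - ndimage.laplace(guide)             # second-order-derivative edge enhancement
    raise ValueError(f"unknown mode: {mode}")
```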
As illustrated in
In this second processing example, a plurality of multispectral images (e.g., color-separated images) is merged and divided upon creating the guide image and then image processing such as noise removal processing and edge enhancement processing is applied to the merged and divided image, which allows the guide image to have a higher S/N ratio. Thus, enabling the guide image to have a higher S/N ratio makes it possible to enhance the NR effect.
In this regard, as illustrated in
Moreover, in the second processing example, although the guide image generation unit 133 performs image processing after summing up a plurality of multispectral images and performing a division on the result, this is not limited to such an example, and for example, it is also possible to perform image processing after summing up the plurality of multispectral images and before performing the division.
As illustrated in
In this third processing example, a plurality of multispectral images (e.g., color-separated images) in the case of creating the guide image is preprocessed by zeroing out a pixel equal to or less than a predetermined positive threshold, and then the plurality of multispectral images after zero-filling is merged and divided to generate the guide image with a higher S/N ratio. Thus, enabling the guide image to have a higher S/N ratio makes it possible to enhance the NR effect.
The guide image generation unit 133 is capable of determining the positive threshold for the multispectral image, such as a stained fluorescent component image D2 (see
According to this example, the positive threshold is determined on the basis of the unstained fluorescent component image D22 obtained from an unstained specimen fluorescence spectrum D21, which is used as a negative control group. Thus, in the stained fluorescent component image D2, it is possible to accurately distinguish an image section affected by the fluorescence caused by the fluorescent reagent 10A from those unaffected by such fluorescence, and identify it as a positive cell image.
The guide image generation unit 133 may, for example, determine a luminance value (referred to as “T” in
For example, the guide image generation unit 133 may determine the maximum luminance value of the unstained fluorescent component image D22 as the edge of the histogram of the unstained fluorescent component image D22. Alternatively, the guide image generation unit 133 may calculate the slope of the gradient (referred to as “G” in
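A hedged sketch of determining the positive threshold from the unstained fluorescent component image D22 and applying zero-filling; the maximum-luminance rule follows the description above, while the gradient-based criterion and its tolerance are illustrative stand-ins for the slope-based determination.

```python
import numpy as np

def positive_threshold_from_unstained(unstained, method="max", bins=256, grad_tol=1e-3):
    values = np.asarray(unstained, dtype=np.float64).ravel()
    if method == "max":
        return values.max()                      # edge of the histogram = maximum luminance
    hist, edges = np.histogram(values, bins=bins)
    slope = np.gradient(hist.astype(np.float64))
    for i in range(bins - 1, -1, -1):            # scan from the bright end of the histogram
        if abs(slope[i]) > grad_tol * hist.max():
            return edges[i + 1]                  # luminance where the slope becomes significant
    return values.max()

def zero_fill(image, positive_threshold):
    # Zero out pixels at or below the positive threshold before merging.
    return np.where(np.asarray(image, dtype=np.float64) > positive_threshold, image, 0.0)
```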
As illustrated in
In this fourth processing example, in the case of creating the guide image, a pixel equal to or less than the positive threshold is zeroed in advance and the result is merged and divided, and then image processing such as noise removal processing and edge enhancement processing is performed on the merged and divided image, thereby enabling the guide image to have a higher S/N ratio. Thus, enabling the guide image to have a higher S/N ratio makes it possible to enhance the NR effect.
As illustrated in
On the other hand, as illustrated in
Moreover, the degree of NR effect varies depending on the type of processing-target images, but for example, if the processing-target image is a multispectral image subjected to color separation, then the procedure of image processing in
As illustrated in
In this fifth processing example, in the case of creating the guide image, only images corresponding to a specific cell type, such as membrane-stained markers, are merged and divided, thereby enabling the guide image to have a higher S/N. Thus, enabling the guide image to have a higher S/N ratio makes it possible to enhance the NR effect.
As illustrated in
In this sixth processing example, in the case of creating the guide image, only images corresponding to a specific cell type, such as only membrane-stained markers, are merged and divided, and then image processing such as noise removal processing or edge enhancement processing is performed on the merged and divided images, thereby enabling the guide image to have a higher S/N ratio. Thus, enabling the guide image to have a higher S/N ratio makes it possible to enhance the NR effect.
As illustrated in
In this seventh processing example, in the case of creating the guide image, a pixel equal to or less than a predetermined positive threshold is zeroed out only in the images corresponding to a specific cell type, and then only these zeroed images are merged and divided to enable the guide image to have a higher S/N ratio. Thus, enabling the guide image to have a higher S/N ratio makes it possible to enhance the NR effect.
As illustrated in
According to this eighth processing example, in the case of creating the guide image, pixels equal to or less than a predetermined positive threshold are zeroed in the images corresponding to a specific cell type, only those zero-filled images are merged and the result is divided, and image processing such as noise removal processing or edge enhancement processing is subsequently applied to the merged and divided image, thereby enabling the guide image to have a higher S/N ratio. Thus, enabling the guide image to have a higher S/N ratio makes it possible to enhance the NR effect.
As illustrated in
According to this ninth processing example, in the case of creating the guide image, the results of cell analysis (e.g., the positive cell rate or the number of positive cells) are used as weights for merging. This enables the incorporation of cell analysis results into the guide image creation.
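A minimal sketch of merging marker images with cell analysis results (e.g., positive cell rates) as weights; normalizing by the total weight is an assumption made for illustration.

```python
import numpy as np

def weighted_guide(images, positive_rates):
    # `images`: (H, W) marker images; `positive_rates`: one cell analysis result per image.
    stack = np.stack([np.asarray(img, dtype=np.float64) for img in images], axis=0)
    weights = np.asarray(positive_rates, dtype=np.float64)
    return np.tensordot(weights, stack, axes=1) / weights.sum()  # weighted merge, normalized
```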
As illustrated in
According to this tenth processing example, in the case of creating the guide image, the images are automatically weighted until the positive rate matches that of a marker of the same cell type, and are then merged. This eliminates the need for human judgment, enabling automation.
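A sketch of automatically weighting a low-S/N marker image until its positive rate matches that of a reference marker of the same cell type, before merging; the bisection search and the pixel-level definition of the positive rate are assumptions made for illustration.

```python
import numpy as np

def match_positive_rate(image, positive_threshold, target_rate, w_lo=0.1, w_hi=10.0, iters=30):
    # Find a weight such that the fraction of pixels above the positive threshold
    # matches the positive rate of a reference marker of the same cell type.
    image = np.asarray(image, dtype=np.float64)
    rate = lambda w: np.mean(image * w > positive_threshold)
    w = w_hi
    for _ in range(iters):                 # bisection on the scalar weight
        w = 0.5 * (w_lo + w_hi)
        if rate(w) < target_rate:
            w_lo = w                       # too few positives: increase the weight
        else:
            w_hi = w
    return image * w                       # weighted image, ready for merging
```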
As described above, according to the present embodiment, the information processing apparatus 100 includes the guide image generation unit 133 that generates a guide image for correction by summing up a plurality of images (e.g., color-separated images), each containing spectral information pertaining to a biomarker, and performing a division by the number of summed images. This configuration makes it possible to perform NR correction on the processing-target image using the guide image, thereby enabling the acquisition of a necessary signal obscured or buried in the background of the processing-target image while preserving the requisite signal intensity for analysis.
Additionally, the information processing apparatus 100 may further include the correction unit 134, which performs noise reduction correction on the processing-target image using the guide image. This configuration makes it possible to reliably obtain the necessary signal obscured or buried in the background of the processing-target image while preserving the signal intensity necessary for analysis.
Furthermore, the correction unit 134 may perform outlier processing on the processing-target image before the noise reduction correction. This configuration allows the guide image to have a higher S/N ratio, thereby enhancing the NR effect.
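A simple form of such outlier processing, sketched under the assumption that outliers are isolated extreme pixel values, is percentile clipping; the percentile bounds are illustrative, not values from the disclosure.

```python
import numpy as np

def suppress_outliers(image, lo_pct=0.1, hi_pct=99.9):
    """Clip isolated extreme pixel values before the noise reduction correction."""
    lo, hi = np.percentile(image, [lo_pct, hi_pct])
    return np.clip(image, lo, hi)
```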
Further, the guide image generation unit 133 may perform image processing after summing up a plurality of images and performing a division on the result. This configuration allows the guide image to have a higher S/N ratio, thereby enhancing the NR effect.
Furthermore, the guide image generation unit 133 may perform image processing after summing up the plurality of images and before performing a division on the result. This configuration allows the guide image to have a higher S/N ratio, thereby enhancing the NR effect.
Additionally, the guide image generation unit 133 may perform processing of zeroing out a pixel that is equal to or less than a predetermined positive threshold value for a plurality of images before summing up the plurality of images. This configuration allows the guide image to have a higher S/N ratio, thereby enhancing the NR effect.
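The zeroing step can be sketched as follows; the threshold value itself is assumed to be chosen per marker and is not specified here. The thresholded list can then be passed to the averaging helper sketched earlier (a hypothetical name).

```python
import numpy as np

def zero_below_threshold(images, threshold):
    """Zero out pixels equal to or less than a positive threshold in each image
    before the images are summed for guide-image generation."""
    return [np.where(np.asarray(img) > threshold, img, 0.0) for img in images]
```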
Further, the guide image generation unit 133 may perform processing of zeroing out a pixel that is equal to or less than a predetermined positive threshold for a plurality of images, and may perform image processing after summing up the plurality of images and performing a division on the result. This configuration allows the guide image to have a higher S/N ratio, thereby enhancing the NR effect.
Additionally, the guide image generation unit 133 may perform processing of zeroing out a pixel that is equal to or less than a predetermined positive threshold for a plurality of images, and may perform image processing after summing up the plurality of images and before performing a division on the result. This configuration allows the guide image to have a higher S/N ratio, thereby enhancing the NR effect.
Furthermore, the guide image generation unit 133 may sum up only images corresponding to a specific cell type out of the plurality of images. This configuration allows the guide image to have a higher S/N ratio, thereby enhancing the NR effect.
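For example, restricting the merge to markers of one cell type could be sketched as below; the marker names and the dictionary layout are illustrative assumptions.

```python
def select_marker_images(images_by_marker, markers_of_interest):
    """Keep only the color-separated images of markers belonging to the
    targeted cell type (e.g., membrane-stained markers) before merging."""
    return [images_by_marker[name] for name in markers_of_interest]

# Hypothetical usage (marker names are placeholders):
# membrane_images = select_marker_images(images_by_marker, ["marker_A", "marker_B"])
# guide = generate_guide_image(membrane_images)
```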
Further, the guide image generation unit 133 may perform image processing after summing up only images corresponding to the specific cell type out of the plurality of images and performing a division on the result. This configuration allows the guide image to have a higher S/N ratio, thereby enhancing the NR effect.
Furthermore, the guide image generation unit 133 may perform image processing after summing up only images corresponding to the specific cell type out of the plurality of images and before performing a division on the result. This configuration allows the guide image to have a higher S/N ratio, thereby enhancing the NR effect.
In addition, the guide image generation unit 133 may perform zero-filling processing of zeroing a pixel equal to or less than a predetermined positive threshold on the images corresponding to the specific cell type out of the plurality of images, and may sum up only the images corresponding to the specific cell type after the zero-filling processing. This configuration allows the guide image to have a higher S/N ratio, thereby enhancing the NR effect.
Additionally, the guide image generation unit 133 may perform image processing after summing up only the images corresponding to the specific cell type after the zero-filling processing and performing a division on the result. This configuration allows the guide image to have a higher S/N ratio, thereby enhancing the NR effect.
Further, the guide image generation unit 133 may perform image processing after summing up only the images corresponding to the specific cell type after the zero-filling processing and before performing a division on the result. This configuration allows the guide image to have a higher S/N ratio, thereby enhancing the NR effect.
Furthermore, the guide image generation unit 133 may sum up a plurality of images using the analysis result for the processing-target image as a weight. This configuration enables the incorporation of analysis results (e.g., cell analysis results) into the guide image creation.
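One hedged way to use an analysis result as merge weights is the weighted average below; normalizing by the weight sum (rather than by the image count) is an assumption about how the division step would generalize.

```python
import numpy as np

def weighted_guide_image(images, weights):
    """Merge images using per-image weights derived from an analysis result
    (e.g., a positive cell rate per marker), normalized by the weight sum."""
    w = np.asarray(weights, dtype=np.float32)
    stack = np.stack([np.asarray(img, dtype=np.float32) for img in images], axis=0)
    return np.tensordot(w, stack, axes=1) / w.sum()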
Additionally, the guide image generation unit 133 may repeatedly sum up a plurality of images using the analysis result as a weight until the analysis result becomes comparable to that of a comparison target. This eliminates the need for human judgment, enabling automation.
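The iterative variant could look like the sketch below, which reuses the weighted merge sketched above; `measure_positive_rate` stands in for the cell-analysis step, and the proportional weight update is only a plausible placeholder, not the rule used in the disclosure.

```python
def tune_low_snr_weight(low_snr_image, reference_images, measure_positive_rate,
                        target_rate, tol=0.01, max_iter=20):
    """Repeatedly re-weight the low-S/N marker image and re-merge until the
    positive rate measured on the guide image is comparable to target_rate
    (the rate of a same-cell-type comparison marker)."""
    w = 1.0
    guide = None
    for _ in range(max_iter):
        guide = weighted_guide_image([low_snr_image] + reference_images,
                                     [w] + [1.0] * len(reference_images))
        rate = measure_positive_rate(guide)
        if abs(rate - target_rate) <= tol:
            break
        w *= target_rate / max(rate, 1e-6)   # crude proportional adjustment
    return guide, w
```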
Further, each of the plurality of images may be a color-separated image. Even if each image is a color-separated image, it is possible to acquire a necessary signal obscured or buried in the background of the processing-target image while preserving the signal intensity necessary for analysis.
Additionally, the guide image generation unit 133 may perform image processing using a noise removal filter and an edge enhancement filter. This configuration allows the guide image to have a higher S/N ratio, thereby enhancing the NR effect.
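A hedged example of such post-processing combines a smoothing step with a simple unsharp-mask edge enhancement; the filter types and parameter values are illustrative choices, not those of the disclosure.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, median_filter

def refine_guide(guide, nr_sigma=1.0, amount=0.5):
    """Noise removal followed by unsharp-mask edge enhancement of the guide image."""
    g = np.asarray(guide, dtype=np.float32)
    denoised = median_filter(gaussian_filter(g, sigma=nr_sigma), size=3)
    blurred = gaussian_filter(denoised, sigma=2.0)
    return denoised + amount * (denoised - blurred)   # edge enhancement
```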
The processing according to the embodiments or modifications described above may be implemented in various other forms. For example, out of the processing described in the embodiments mentioned above, the entirety or a part of the processing operations described as being performed automatically can be performed manually, or the entirety or a part of the processing operations described as being performed manually can be performed automatically using known methods. In addition, unless specifically stated otherwise, the processing procedures, specific names, and information including various data and parameters illustrated in the specification and drawings can be changed optionally. For example, the various types of information illustrated in each figure are not limited to the illustrated information.
Furthermore, each component of respective apparatuses or devices illustrated in the drawings represents a functional concept and may not necessarily be configured physically as illustrated. In other words, the specific form of distributing and integrating each apparatus or device is not limited to what is illustrated in the drawings, and the entirety or a part of the apparatuses or devices can be functionally or physically distributed or integrated into optional units depending on various loads and usage conditions.
Moreover, the embodiments or modifications described above can be combined as appropriate within a range that does not conflict with the processing details. Furthermore, the effects described in this specification are merely exemplary and not limiting, and other effects may also be achievable.
The technology according to the present disclosure can be applied to, for example, a fluorescence observation apparatus 500 (an example of a microscope system) or the like. Hereinafter, a configuration example of an applicable fluorescence observation apparatus 500 will be described with reference to
As shown in
The observation unit 1 includes an excitation unit (irradiation unit) 10, a stage 20, a spectral imaging unit 30, an observation optical system 40, a scanning mechanism 50, a focus mechanism 60, and a non-fluorescence observing unit 70.
The excitation unit 10 irradiates the observation target with a plurality of beams of irradiation light having different wavelengths. For example, the excitation unit 10 irradiates a pathological specimen (pathological sample), which is the observation target, with a plurality of line illuminations having different wavelengths arranged in parallel with different axes. The stage 20 is a table that supports the pathological specimen and is configured to be movable, by the scanning mechanism 50, in a direction perpendicular to the direction of the line light formed by the line illuminations. The spectral imaging unit 30 includes a spectroscope and acquires a fluorescence spectrum (spectroscopic data) of the pathological specimen excited linearly by the line illuminations.
That is, the observation unit 1 functions as a line spectroscope that acquires spectroscopic data corresponding to the line illuminations. Further, the observation unit 1 also functions as an imaging device that captures a plurality of fluorescence images generated by an imaging target (pathological specimen) for each of a plurality of fluorescence wavelengths for each line and acquires data of the plurality of captured fluorescence images in an arrangement order of the lines.
Here, "parallel with different axes" means that the plurality of line illuminations have different axes and are parallel to one another. "Different axes" means that the axes are not coaxial, and the distance between the axes is not particularly limited. "Parallel" is not limited to parallel in a strict sense, and includes a state of being substantially parallel. For example, there may be distortion originating from an optical system such as a lens, or deviation from a parallel state due to manufacturing tolerance, and such a case is also regarded as parallel.
The excitation unit 10 and the spectral imaging unit 30 are connected to the stage 20 via the observation optical system 40. The observation optical system 40 has a function of following an optimum focus by the focus mechanism 60. The non-fluorescence observing unit 70 for performing dark field observation, bright field observation, and the like may be connected to the observation optical system 40. In addition, a control unit 80 that controls the excitation unit 10, the spectral imaging unit 30, the scanning mechanism 50, the focus mechanism 60, the non-fluorescence observing unit 70, and the like may be connected to the observation unit 1.
The process unit 2 includes a storing unit 21, a data calibration unit 22, and an image formation unit 23. The process unit 2 typically forms an image of the pathological specimen or outputs a distribution of the fluorescence spectrum on the basis of the fluorescence spectrum of the pathological specimen (hereinafter also referred to as a sample S) acquired by the observation unit 1. The image referred to herein includes, for example, an image representing the constituent ratio of the dyes, sample-derived autofluorescence, or the like constituting the spectrum, an image in which the waveforms are converted into RGB (red, green, and blue) colors, and a luminance distribution in a specific wavelength band.
The storing unit 21 includes a nonvolatile storage medium such as a hard disk drive or a flash memory, and a storage control unit that controls writing and reading of data to and from the storage medium. The storing unit 21 stores spectroscopic data indicating a correlation between each wavelength of light emitted by each of the plurality of line illuminations included in the excitation unit 10 and fluorescence received by the camera of the spectral imaging unit 30. Further, the storing unit 21 stores in advance information indicating a standard spectrum of autofluorescence related to a sample (pathological specimen) to be observed and information indicating a standard spectrum of a single dye staining the sample.
The data calibration unit 22 calibrates the spectroscopic data stored in the storing unit 21 on the basis of the image captured by the camera of the spectral imaging unit 30. The image formation unit 23 forms a fluorescence image of the sample on the basis of the spectroscopic data and the interval Δy of the plurality of line illuminations emitted by the excitation unit 10. For example, the process unit 2 including the data calibration unit 22, the image formation unit 23, and the like is implemented by hardware elements used in a computer, such as a central processing unit (CPU), a random access memory (RAM), and a read only memory (ROM), and a necessary program (software). Instead of or in addition to the CPU, a programmable logic device (PLD) such as a field programmable gate array (FPGA), a digital signal processor (DSP), an application specific integrated circuit (ASIC), or the like may be used.
The display unit 3 displays, for example, various types of information such as an image based on the fluorescence image formed by the image formation unit 23. The display unit 3 may include, for example, a monitor integrally attached to the process unit 2, or may be a display device connected to the process unit 2. The display unit 3 includes, for example, a display element such as a liquid crystal device or an organic EL device, and a touch sensor, and is configured as a user interface (UI) that displays input settings of image-capturing conditions, a captured image, and the like.
Next, details of the observation unit 1 will be described with reference to
As shown in
Furthermore, the excitation unit 10 includes a plurality of collimator lenses 11, a plurality of laser line filters 12, a plurality of dichroic mirrors 13a, 13b, and 13c, a homogenizer 14, a condenser lens 15, and an incident slit 16 so as to correspond to each of the excitation light sources L1 to L4.
The laser light emitted from the excitation light source L1 and the laser light emitted from the excitation light source L3 are collimated by the collimator lens 11, transmitted through the laser line filter 12 for cutting a skirt of each wavelength band, and made coaxial by the dichroic mirror 13a. The two coaxial laser lights are further beam-shaped by the homogenizer 14 such as a fly-eye lens and the condenser lens 15 so as to be the line illumination Ex1.
Similarly, the laser light emitted from the excitation light source L2 and the laser light emitted from the excitation light source L4 are made coaxial by the dichroic mirrors 13b and 13c and shaped into the line illumination Ex2, which has an axis different from that of the line illumination Ex1. The line illuminations Ex1 and Ex2 form line illuminations with different axes (a primary image), separated by a distance Δy, in the incident slit 16 (slit conjugate) having a plurality of slit portions through which each of the line illuminations Ex1 and Ex2 can pass.
Note that, in the present embodiment, an example in which the four lasers are combined into two coaxial pairs forming two different axes will be described; alternatively, two lasers may be arranged on two different axes, or the four lasers may be arranged on four different axes.
The sample S on the stage 20 is irradiated with the primary image via the observation optical system 40. The observation optical system 40 includes a condenser lens 41, dichroic mirrors 42 and 43, an objective lens 44, a band pass filter 45, and a condenser lens (an example of an imaging lens) 46. The line illuminations Ex1 and Ex2 are collimated by the condenser lens 41 paired with the objective lens 44, reflected by the dichroic mirrors 42 and 43, transmitted through the objective lens 44, and irradiate the sample S on the stage 20.
Here,
The line illuminations Ex1 and Ex2 are formed on the surface of the sample S as shown in
As shown in
In the example of
The observation slit 31 is disposed at the condensing point of the condenser lens 46, and has the same number of slit portions as the number of excitation lines (two in this example). The fluorescence spectra derived from the two excitation lines that have passed through the observation slit 31 are separated by the first prism 33 and reflected by a grating surface of the diffraction grating 35 via the mirror 34, so that the fluorescence spectra are further separated into fluorescence spectra of the respective excitation wavelengths. The four separated fluorescence spectra are incident on the imaging elements 32a and 32b via the mirror 34 and the second prism 36, and are developed into spectroscopic data (x, λ) expressed by the position x in the line direction and the wavelength λ. The spectroscopic data (x, λ) is a pixel value of a pixel at the position x in the row direction and at the position of the wavelength λ in the column direction among the pixels included in the imaging element 32. Note that the spectroscopic data (x, λ) may be simply described as spectroscopic data.
Note that the pixel size (nm/Pixel) of the imaging elements 32a and 32b is not particularly limited, and is set, for example, to be equal to or more than 2 (nm/Pixel) and equal to or less than 20 (nm/Pixel). This dispersion value may be achieved optically, for example by the pitch of the diffraction grating 35, or may be achieved by using hardware binning of the imaging elements 32a and 32b. In addition, the dichroic mirror 42 and the band pass filter 45 are inserted in the middle of the optical path so that the excitation light (line illuminations Ex1 and Ex2) does not reach the imaging element 32.
Each of the line illuminations Ex1 and Ex2 is not limited to the case of being configured with a single wavelength, and each may be configured with a plurality of wavelengths. When the line illuminations Ex1 and Ex2 are each formed by a plurality of wavelengths, the fluorescence excited by these also includes a plurality of spectra. In this case, the spectral imaging unit 30 includes a wavelength dispersion element for separating the fluorescence into a spectrum derived from the excitation wavelength. The wavelength dispersion element includes a diffraction grating, a prism, or the like, and is typically disposed on an optical path between the observation slit 31 and the imaging element 32.
Note that the stage 20 and the scanning mechanism 50 constitute an X-Y stage, and move the sample S in the X-axis direction and the Y-axis direction in order to acquire a fluorescence image of the sample S. In the whole slide imaging (WSI), an operation of scanning the sample S in the Y-axis direction, then moving the sample S in the X-axis direction, and further performing scanning in the Y-axis direction is repeated. By using the scanning mechanism 50, it is possible to continuously acquire dye spectra (fluorescence spectra) excited at different excitation wavelengths, which are spatially separated by the distance Δy on the sample S (observation target Sa) in the Y-axis direction.
The scanning mechanism 50 changes the position irradiated with the irradiation light in the sample S over time. For example, the scanning mechanism 50 scans the stage 20 in the Y-axis direction. The scanning mechanism 50 can cause the stage 20 to scan the plurality of line illuminations Ex1 and Ex2 in the Y-axis direction, that is, in the arrangement direction of the line illuminations Ex1 and Ex2. This is not limited to this example, and the plurality of line illuminations Ex1 and Ex2 may be scanned in the Y-axis direction by a galvano mirror disposed in the middle of the optical system. Since the data derived from each of the line illuminations Ex1 and Ex2 (for example, the two-dimensional data or the three-dimensional data) is data whose coordinates are shifted by the distance Δy with respect to the Y axis, the data is corrected and output on the basis of the distance Δy stored in advance or the value of the distance Δy calculated from the output of the imaging element 32.
As shown in
The light source 71 is disposed on the side facing the objective lens 44 with respect to the stage 20, and irradiates the sample S on the stage 20 with illumination light from the side opposite to the line illuminations Ex1 and Ex2. In the case of dark field illumination, the light source 71 illuminates from outside the NA (numerical aperture) of the objective lens 44, and light (a dark field image) diffracted by the sample S is imaged by the imaging element 73 via the objective lens 44, the dichroic mirror 43, and the condenser lens 72. By using dark field illumination, even an apparently transparent sample such as a fluorescently-stained sample can be observed with contrast.
Note that this dark field image may be observed simultaneously with fluorescence and used for real-time focusing. In this case, as the illumination wavelength, a wavelength that does not affect fluorescence observation may be selected. The non-fluorescence observing unit 70 is not limited to the observation system that acquires a dark field image, and may be configured by an observation system that can acquire a non-fluorescence image such as a bright field image, a phase difference image, a phase image, and an in-line hologram image. For example, as a method for acquiring a non-fluorescence image, various observation methods such as a Schlieren method, a phase difference contrast method, a polarization observation method, and an epi-illumination method can be employed. The position of the illumination light source is not limited to below the stage 20, and may be above the stage 20 or around the objective lens 44. In addition, not only a method of performing focus control in real time, but also another method such as a prefocus map method of recording focus coordinates (Z coordinates) in advance may be employed.
Note that, in the above description, the line illumination as the excitation light includes the two line illuminations Ex1 and Ex2, but is not limited thereto, and may include three, four, or five or more line illuminations. In addition, each line illumination may include a plurality of excitation wavelengths selected so that the color separation performance is degraded as little as possible. Further, even with only one line illumination, if it is an excitation light source including a plurality of excitation wavelengths and each excitation wavelength is recorded in association with the data acquired by the imaging element 32, it is possible to obtain a polychromatic spectrum, although the separability provided by the parallel different-axis arrangement cannot be obtained.
The application example in which the technology according to the present disclosure is applied to the fluorescence observation apparatus 500 has been described above. Note that the configuration described above with reference to
The technology according to the present disclosure can be applied to, for example, a microscope system and the like. Hereinafter, a configuration example of a microscope system 5000 that can be applied will be described with reference to
The microscope system 5000 may be designed as a so-called whole slide imaging (WSI) system or a digital pathology imaging system, and can be used for pathological diagnosis. Alternatively, the microscope system 5000 may be designed as a fluorescence imaging system, or particularly, as a multiple fluorescence imaging system.
For example, the microscope system 5000 may be used to make an intraoperative pathological diagnosis or a telepathological diagnosis. In the intraoperative pathological diagnosis, the microscope device 5100 can acquire the data of the biological sample S acquired from the subject of the operation while the operation is being performed, and then transmit the data to the information processing unit 5120. In the telepathological diagnosis, the microscope device 5100 can transmit the acquired data of the biological sample S to the information processing unit 5120 located in a place away from the microscope device 5100 (such as in another room or building). In these diagnoses, the information processing unit 5120 then receives and outputs the data. On the basis of the output data, the user of the information processing unit 5120 can make a pathological diagnosis.
The biological sample S may be a sample containing a biological component. The biological component may be a tissue, a cell, a liquid component of the living body (blood, urine, or the like), a culture, or a living cell (a myocardial cell, a nerve cell, a fertilized egg, or the like). The biological sample may be a solid, or may be a specimen fixed with a fixing reagent such as paraffin or a solid formed by freezing. The biological sample can be a section of the solid. A specific example of the biological sample may be a section of a biopsy sample.
The biological sample may be one that has been subjected to a treatment such as staining or labeling. The treatment may be staining for indicating the morphology of the biological component or for indicating the substance (surface antigen or the like) contained in the biological component, and can be hematoxylin-eosin (HE) staining or immunohistochemistry staining, for example. The biological sample may be one that has been subjected to the above treatment with one or more reagents, and the reagent(s) can be a fluorescent dye, a coloring reagent, a fluorescent protein, or a fluorescence-labeled antibody.
The specimen may be prepared from a tissue sample for the purpose of pathological diagnosis or clinical examination. Alternatively, the specimen is not necessarily of the human body, and may be derived from an animal, a plant, or some other material. The specimen may differ in property, depending on the type of the tissue being used (such as an organ or a cell, for example), the type of the disease being examined, the attributes of the subject (such as age, gender, blood type, and race, for example), or the subject's daily habits (such as an eating habit, an exercise habit, and a smoking habit, for example). The specimen may be accompanied by identification information (bar code, QR code (registered trademark), or the like) for identifying each specimen, and be managed in accordance with the identification information.
The light irradiation unit 5101 is a light source for illuminating the biological sample S, and is an optical unit that guides light emitted from the light source to a specimen. The light source can illuminate a biological sample with visible light, ultraviolet light, infrared light, or a combination thereof. The light source may be one or more of the following: a halogen light source, a laser light source, an LED light source, a mercury light source, and a xenon light source. The light source in fluorescent observation may be of a plurality of types and/or wavelengths, and the types and the wavelengths may be appropriately selected by a person skilled in the art. The light irradiation unit 5101 may have a configuration of a transmissive type, a reflective type, or an epi-illumination type (a coaxial epi-illumination type or a side-illumination type).
The optical unit 5102 is designed to guide the light from the biological sample S to the signal acquisition unit 5103. The optical unit 5102 may be designed to enable the microscope device 5100 to observe or capture an image of the biological sample S. The optical unit 5102 may include an objective lens. The type of the objective lens may be appropriately selected by a person skilled in the art, in accordance with the observation method. The optical unit 5102 may also include a relay lens for relaying an image magnified by the objective lens to the signal acquisition unit 5103. The optical unit 5102 may further include optical components other than the objective lens and the relay lens, and the optical components may be an eyepiece, a phase plate, a condenser lens, and the like. The optical unit 5102 may further include a wavelength separation unit designed to separate light having a predetermined wavelength from the light from the biological sample S. The wavelength separation unit may be designed to selectively cause light having a predetermined wavelength or a predetermined wavelength range to reach the signal acquisition unit 5103. The wavelength separation unit may include one or more of the following: a filter, a polarizing plate, a prism (Wollaston prism), and a diffraction grating that selectively pass light, for example. The optical component(s) included in the wavelength separation unit may be disposed in the optical path from the objective lens to the signal acquisition unit 5103, for example. The wavelength separation unit is provided in the microscope device 5100 in a case where fluorescent observation is performed, or particularly, where an excitation light irradiation unit is included. The wavelength separation unit may be designed to separate fluorescence or white light from fluorescence.
The signal acquisition unit 5103 may be designed to receive light from the biological sample S, and convert the light into an electrical signal, or particularly, into a digital electrical signal. The signal acquisition unit 5103 may be designed to be capable of acquiring data about the biological sample S, on the basis of the electrical signal. The signal acquisition unit 5103 may be designed to be capable of acquiring data of an image (a captured image, or particularly, a still image, a time-lapse image, or a moving image) of the biological sample S, or particularly, may be designed to acquire data of an image enlarged by the optical unit 5102. The signal acquisition unit 5103 includes one or more image sensors, CMOSs, CCDs, or the like that include a plurality of pixels arranged in one- or two-dimensional manner. The signal acquisition unit 5103 may include an image sensor for acquiring a low-resolution image and an image sensor for acquiring a high-resolution image, or may include an image sensor for sensing for AF or the like and an image sensor for outputting an image for observation or the like. The image sensor may include not only the plurality of pixels, but also a signal processing unit (including one or more of the following: a CPU, a DSP, and a memory) that performs signal processing using pixel signals from the respective pixels, and an output control unit that controls outputting of image data generated from the pixel signals and processed data generated by the signal processing unit. The image sensor including the plurality of pixels, the signal processing unit, and the output control unit can be preferably designed as a one-chip semiconductor device. Note that the microscope system 5000 may further include an event detection sensor. The event detection sensor includes a pixel that photoelectrically converts incident light, and may be designed to detect that a change in the luminance of the pixel exceeds a predetermined threshold, and regard the change as an event. The event detection sensor may be of an asynchronous type.
The control unit 5110 controls imaging being performed by the microscope device 5100. For the imaging control, the control unit 5110 can drive movement of the optical unit 5102 and/or the sample placement unit 5104, to adjust the positional relationship between the optical unit 5102 and the sample placement unit 5104. The control unit 5110 can move the optical unit 5102 and/or the sample placement unit 5104 in a direction toward or away from each other (in the optical axis direction of the objective lens, for example). The control unit 5110 may also move the optical unit 5102 and/or the sample placement unit 5104 in any direction in a plane perpendicular to the optical axis direction. For the imaging control, the control unit 5110 may control the light irradiation unit 5101 and/or the signal acquisition unit 5103.
The sample placement unit 5104 may be designed to be capable of securing the position of a biological sample on the sample placement unit 5104, and may be a so-called stage. The sample placement unit 5104 may be designed to be capable of moving the position of the biological sample in the optical axis direction of the objective lens and/or in a direction perpendicular to the optical axis direction.
The information processing unit 5120 can acquire, from the microscope device 5100, data (imaging data or the like) acquired by the microscope device 5100. The information processing unit 5120 can perform image processing on the imaging data. The image processing may include an unmixing process, or more specifically, a spectral unmixing process. The unmixing process may include a process of extracting data of the optical component of a predetermined wavelength or in a predetermined wavelength range from the imaging data to generate image data, or a process of removing data of the optical component of a predetermined wavelength or in a predetermined wavelength range from the imaging data. The image processing may also include an autofluorescence separation process for separating the autofluorescence component and the dye component of a tissue section, and a fluorescence separation process for separating wavelengths between dyes having different fluorescence wavelengths from each other. The autofluorescence separation process may include a process of removing the autofluorescence component from image information about another specimen, using an autofluorescence signal extracted from one specimen of the plurality of specimens having the same or similar properties. The information processing unit 5120 may transmit data for the imaging control to the control unit 5110, and the control unit 5110 that has received the data may control the imaging being performed by the microscope device 5100 in accordance with the data.
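As a rough illustration of what such an unmixing process computes, the sketch below fits each measured pixel spectrum as a linear combination of reference spectra by ordinary least squares; real pipelines may add non-negativity constraints or autofluorescence-specific handling, and the array shapes are assumptions for this example.

```python
import numpy as np

def unmix(pixel_spectra, reference_spectra):
    """Ordinary least-squares spectral unmixing.
    pixel_spectra: (H, W, L) measured spectra; reference_spectra: (K, L)
    standard spectra of dyes/autofluorescence. Returns (H, W, K) abundances."""
    h, w, l = pixel_spectra.shape
    A = reference_spectra.T                    # (L, K) mixing matrix
    B = pixel_spectra.reshape(-1, l).T         # (L, H*W) observations
    coeffs, *_ = np.linalg.lstsq(A, B, rcond=None)
    return coeffs.T.reshape(h, w, -1)
```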
The information processing unit 5120 may be designed as an information processing device such as a general-purpose computer, and may include a CPU, RAM, and ROM. The information processing unit 5120 may be included in the housing of the microscope device 5100, or may be located outside the housing. Further, the various processes or functions to be executed by the information processing unit 5120 may be realized by a server computer or a cloud connected via a network.
The method to be implemented by the microscope device 5100 to capture an image of the biological sample S may be appropriately selected by a person skilled in the art, in accordance with the type of the biological sample, the purpose of imaging, and the like. Examples of the imaging method are described below.
One example of the imaging method is as follows. The microscope device 5100 can first identify an imaging target region. The imaging target region may be identified so as to cover the entire region in which the biological sample exists, or may be identified so as to cover the target portion (the portion in which the target tissue section, the target cell, or the target lesion exists) of the biological sample. Next, the microscope device 5100 divides the imaging target region into a plurality of divided regions of a predetermined size, and sequentially captures images of the respective divided regions. As a result, an image of each divided region is acquired.
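A minimal sketch of this division into tiles of a predetermined size is given below; the coordinate convention and the tile size are assumptions made for illustration only.

```python
def divide_into_regions(width, height, tile):
    """Divide an imaging target region of width x height pixels into tiles
    of at most tile x tile pixels, to be captured sequentially."""
    regions = []
    for y in range(0, height, tile):
        for x in range(0, width, tile):
            regions.append((x, y, min(tile, width - x), min(tile, height - y)))
    return regions
```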
As shown in
Another example of the imaging method is as follows. The microscope device 5100 can first identify an imaging target region. The imaging target region may be identified so as to cover the entire region in which the biological sample exists, or may be identified so as to cover the target portion (the portion in which the target tissue section or the target cell exists) of the biological sample. Next, the microscope device 5100 scans a region (also referred to as a “divided scan region”) of the imaging target region in one direction (also referred to as a “scan direction”) in a plane perpendicular to the optical axis, and thus captures an image. After the scanning of the divided scan region is completed, the divided scan region next to the scan region is then scanned. These scanning operations are repeated until an image of the entire imaging target region is captured. As shown in
A hardware configuration example of the information processing device 100 according to each embodiment (or each modification) will be described with reference to
As shown in
The CPU 901 functions as an arithmetic processing device and a control device, and controls the overall operation in the information processing device 100 according to various programs. In addition, the CPU 901 may be a microprocessor. The ROM 902 stores programs, operation parameters, and the like used by the CPU 901. The RAM 903 primarily stores programs used in the execution of the CPU 901, parameters that appropriately change in the execution, and the like. The CPU 901 can embody, for example, at least the processing unit 130 and the control unit 150 of the information processing device 100.
The CPU 901, the ROM 902, and the RAM 903 are mutually connected by a host bus 904a including a CPU bus and the like. The host bus 904a is connected to the external bus 904b such as a peripheral component interconnect/interface (PCI) bus via the bridge 904. Note that the host bus 904a, the bridge 904, and the external bus 904b do not necessarily need to be configured separately, and these functions may be mounted on one bus.
The input device 906 is implemented by, for example, a device to which information is input by an implementer, such as a mouse, a keyboard, a touch panel, a button, a microphone, a switch, and a lever. Furthermore, the input device 906 may be, for example, a remote control device using infrared rays or other radio waves, or may be an external connection device such as a mobile phone or a PDA corresponding to the operation of the information processing device 100. Furthermore, the input device 906 may include, for example, an input control circuit that generates an input signal on the basis of information input by the implementer using the above input units and outputs the input signal to the CPU 901. By operating the input device 906, the implementer can input various data to the information processing device and instruct the information processing device 100 to perform a processing operation. The input device 906 can embody at least the operating unit 160 of the information processing device 100, for example.
The output device 907 is formed by a device capable of visually or audibly notifying the implementer of acquired information. Examples of such a device include a display device such as a CRT display device, a liquid crystal display device, a plasma display device, an EL display device, and a lamp, a sound output device such as a speaker and a headphone, and a printer device. The output device 907 can embody at least the display unit 140 of the information processing device 100, for example.
The storage device 908 is a device for storing data. The storage device 908 is achieved by, for example, a magnetic storage device such as an HDD, a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like. The storage device 908 may include a storage medium, a recording device that records data in the storage medium, a reading device that reads data from the storage medium, a deletion device that deletes data recorded in the storage medium, and the like. The storage device 908 stores programs executed by the CPU 901, various data, various data acquired from the outside, and the like. The storage device 908 can embody at least the storage unit 120 of the information processing device 100, for example.
The drive 909 is a reader/writer for a storage medium, and is built in or externally attached to the information processing device 100. The drive 909 reads information recorded in a removable storage medium such as a mounted magnetic disk, optical disk, magneto-optical disk, or semiconductor memory, and outputs the information to the RAM 903. Furthermore, the drive 909 can also write information to a removable storage medium.
The connection port 911 is an interface connected to an external device, and is a connection port to an external device capable of transmitting data by, for example, a universal serial bus (USB).
The communication device 913 is, for example, a communication interface formed by a communication device or the like for connecting to the network 920. The communication device 913 is, for example, a communication card for wired or wireless local area network (LAN), long term evolution (LTE), Bluetooth (registered trademark), wireless USB (WUSB), or the like. Furthermore, the communication device 913 may be a router for optical communication, a router for asymmetric digital subscriber line (ADSL), a modem for various types of communication, or the like. For example, the communication device 913 can transmit and receive signals and the like to and from the Internet and other communication devices according to a predetermined protocol such as TCP/IP.
In the present embodiment, the sensor 915 includes a sensor capable of acquiring a spectrum (for example, an imaging element or the like), but may include another sensor (for example, an acceleration sensor, a gyro sensor, a geomagnetic sensor, a pressure-sensitive sensor, a sound sensor, a distance measuring sensor, or the like). The sensor 915 can embody at least the image acquisition unit 112 of the information processing device 100, for example.
Note that the network 920 is a wired or wireless transmission path of information transmitted from a device connected to the network 920. For example, the network 920 may include a public network such as the Internet, a telephone network, or a satellite communication network, various local area networks (LANs) including Ethernet (registered trademark), a wide area network (WAN), or the like. In addition, the network 920 may include a dedicated line network such as an Internet protocol-virtual private network (IP-VPN).
The hardware configuration example capable of implementing the functions of the information processing device 100 has been described above. Each of the above-described components may be implemented using a general-purpose member, or may be implemented by hardware specialized for the function of each component. Therefore, it is possible to appropriately change the hardware configuration to be used according to the technical level at the time of implementing the present disclosure.
Note that a computer program for implementing each function of the information processing device 100 as described above can be created and mounted on a PC or the like. Furthermore, it is also possible to provide a computer-readable recording medium storing such a computer program. The recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory, or the like. In addition, the computer program described above may be distributed via, for example, a network without using the recording medium.
Moreover, the present technology can also have the following configurations.
(1)
An information processing apparatus comprising:
a guide image generation unit configured to sum up a plurality of images each including spectral information regarding a biomarker, and perform a division on a result by a number of summed images to generate a guide image for correction.
(2)
The information processing apparatus according to (1), further comprising:
a correction unit configured to perform noise reduction correction on a processing-target image using the guide image.
(3)
The information processing apparatus according to (2), wherein
the correction unit performs outlier processing on the processing-target image before the noise reduction correction.
(4)
The information processing apparatus according to any one of (1) to (3), wherein
the guide image generation unit performs image processing after summing up the plurality of images and performing a division on the result.
(5)
The information processing apparatus according to any one of (1) to (3), wherein
the guide image generation unit performs image processing after summing up the plurality of images and before performing the division on the result.
(6)
The information processing apparatus according to any one of (1) to (3), wherein
the guide image generation unit performs processing of zeroing out a pixel equal to or less than a predetermined positive threshold on the plurality of images before summing up the plurality of images.
(7)
The information processing apparatus according to (6), wherein
the guide image generation unit performs the processing of zeroing out the pixel equal to or less than the predetermined positive threshold on the plurality of images, and performs image processing after summing up the plurality of images and performing the division on the result.
(8)
The information processing apparatus according to (6), wherein
the guide image generation unit performs the processing of zeroing out the pixel equal to or less than the predetermined positive threshold on the plurality of images, and performs image processing after summing up the plurality of images and before performing the division on the result.
(9)
The information processing apparatus according to any one of (1) to (3), wherein
the guide image generation unit sums up only images corresponding to a specific cell type out of the plurality of images.
(10)
The information processing apparatus according to (9), wherein
the guide image generation unit performs image processing after summing up only the images corresponding to the specific cell type out of the plurality of images and performing the division on the result.
(11)
The information processing apparatus according to (9), wherein
the guide image generation unit performs image processing after summing up only the images corresponding to the specific cell type out of the plurality of images and before performing the division on the result.
(12)
The information processing apparatus according to (9), wherein
the guide image generation unit performs zero-filling processing of zeroing out a pixel equal to or less than a predetermined positive threshold on the images corresponding to the specific cell type out of the plurality of images, and sums up only the images corresponding to the specific cell type after the zero-filling processing.
(13)
The information processing apparatus according to (12), wherein
the guide image generation unit performs image processing after summing up only the images corresponding to the specific cell type after the zero-filling processing and performing the division on the result.
(14)
The information processing apparatus according to (12), wherein
the guide image generation unit performs image processing after summing up only the images corresponding to the specific cell type after the zero-filling processing and before performing the division on the result.
(15)
The information processing apparatus according to any one of (1) to (14), wherein
the guide image generation unit sums up the plurality of images using an analysis result for a processing-target image as a weight.
(16)
The information processing apparatus according to (15), wherein
the guide image generation unit repeatedly sums up the plurality of images using the analysis result as the weight until the analysis result becomes comparable to an analysis result of a comparison target.
(17)
The information processing apparatus according to any one of (1) to (16), wherein
the plurality of images is each a color-separated image.
(18)
The information processing apparatus according to (4), (5), (7), (8), (10), (11), (13) or (14), wherein
the guide image generation unit performs the image processing using a noise removal filter and an edge enhancement filter.
(19)
A biological sample observation system comprising:
an image-capturing device configured to acquire a plurality of images each including spectral information regarding a biomarker; and
an information processing apparatus configured to process the plurality of images, wherein
the information processing apparatus includes
a guide image generation unit configured to sum up the plurality of images and perform a division on a result by a number of summed images to generate a guide image for correction.
(20)
An image generation method comprising:
summing up a plurality of images each including spectral information regarding a biomarker, and performing a division on a result by a number of summed images to generate a guide image for correction.
(21)
A biological sample observation system including the information processing apparatus according to any one of (1) to (18).
(22)
An image generation method of generating an image using the information processing apparatus according to any one of (1) to (18).
Application number: 2022-017079; date: February 2022; country: JP; kind: national.
Filing document: PCT/JP2023/002215; filing date: January 25, 2023; country: WO.