The disclosed technology relates to a method for acquiring an evaluation value for evaluating an object.
The following technologies are known as technologies related to a cell evaluation method using digital holography. For example, WO2019/176427A describes a determination method of generating a phase difference image of a cell from a hologram obtained by imaging a cell aggregate formed of a plurality of cells, and determining a state of a cell based on the phase difference image and a shape index value corresponding to a shape of the cell.
JP2012-531584A describes a method including a step of reconstructing phase and/or amplitude information of an object wave front from interference fringes formed by superimposing object light and reference light, and a step of measuring a parameter indicating quality of an embryo or an egg from the phase and/or amplitude information.
A phase image generated based on an interference image, formed by interference between object light transmitted through a cell and reference light coherent with the object light, is an image showing a phase distribution of the object light transmitted through the cell and reflects a state of the cell. Accordingly, the quality of the cell can be evaluated based on the phase image. For example, a total phase amount obtained by integrating and accumulating the phase amounts of the pixels of the phase image can be used as an evaluation value of the cell.
However, it has been found that in a case where the cell is a non-spherical object, the total phase amount changes depending on an orientation (alignment angle) of the cell in the phase image.
Since the orientation (alignment angle) of the cell in the phase image is random, a variation in the total phase amount becomes large in a case where the total phase amount changes in accordance with the alignment angle. As a result, accuracy of the quality evaluation of the cell based on the total phase amount deteriorates. In a case where the total phase amount is used for the evaluation of the cell, it is preferable that the total phase amount derived for an identical cell is a constant value independent of the alignment angle.
The disclosed technology has been made in view of the above points, and an object thereof is to suppress an influence caused by an orientation of an object to be evaluated on an evaluation value derived based on a phase image of the object.
A method for acquiring an evaluation value according to the disclosed technology includes generating a phase image showing a phase distribution of light transmitted through an object, deriving an evaluation value of the object based on the phase image, and correcting the evaluation value by using a correction coefficient determined in accordance with an orientation of the object in the phase image.
The phase image may be generated based on an interference image formed by interference between object light transmitted through the object and reference light coherent to the object light. The evaluation value may be a total phase amount obtained by integrating and accumulating phase amounts for pixels of the phase image.
A virtual object simulating a shape, a dimension, and a refractive index of the object may be created based on the phase image, and the correction coefficient may be derived by using the virtual object. A virtual phase image showing a phase distribution of light transmitted through the virtual object may be generated, a virtual total phase amount obtained by integrating and accumulating phase amounts for pixels of the virtual phase image may be derived, a standard total phase amount obtained by multiplying a volume of the virtual object and a refractive index of the virtual object may be derived, and a ratio between the virtual total phase amount and the standard total phase amount may be derived as the correction coefficient.
The correction coefficient may be derived for each of cases where orientations of the virtual object are different, and a correction value of the evaluation value may be acquired by multiplying the evaluation value by the correction coefficient corresponding to the orientation of the object in the phase image used in a case where the evaluation value is derived.
The object may be a cell. The object may be an embryonic cell in a two-cell stage, and the virtual object may have a three-dimensional structure in which two ellipsoids are connected.
According to the disclosed technology, the influence caused by the orientation of the object to be evaluated on the evaluation value derived based on the phase image of the object can be suppressed.
Exemplary embodiments according to the technique of the present disclosure will be described in detail based on the following figures, wherein:
Hereinafter, an example of an embodiment of the disclosed technology will be described with reference to the drawings. It should be noted that the same or equivalent components and portions in the drawings are assigned the same reference numerals, and overlapping description will be omitted.
A method for acquiring an evaluation value according to an embodiment of the disclosed technology includes generating a phase image showing a phase distribution of light transmitted through an object to be evaluated, deriving an evaluation value of the object based on the phase image, and correcting the evaluation value by using a correction coefficient determined in accordance with an orientation of the object in the phase image.
(Acquisition of Interference Image: Step S1)
A method for acquiring the interference image will be described below.
The holography apparatus 10 includes a splitter 21, reflection mirrors 22 and 24, an objective lens 23, an imaging lens 25, a combiner 26, and an imaging apparatus 30. A cell 60 to be evaluated is disposed between the reflection mirror 22 and the objective lens 23 in a state where the cell is accommodated in a container 61 together with a culture solution.
For example, a HeNe laser having a wavelength of 632.8 nm can be used as a laser light source 20. Laser light L0, which is linearly polarized light emitted from the laser light source 20, is divided into two laser light rays by the splitter 21. One of the two laser light rays is object light L1, and the other is reference light L2. A beam splitter can be used as the splitter 21. The object light L1 is incident on the reflection mirror 22. The cell 60 is irradiated with the object light L1, the traveling direction of which is bent by the reflection mirror 22.
An image due to the object light L1 transmitted through the cell 60 is magnified by the objective lens 23. The object light L1 transmitted through the objective lens 23 has its traveling direction bent by the reflection mirror 24 and is incident on the combiner 26 through the imaging lens 25. On the other hand, the reference light L2 is also incident on the combiner 26. The object light L1 and the reference light L2 are combined by the combiner 26 and imaged on an imaging surface of the imaging apparatus 30. A beam splitter can be used as the combiner 26.
The interference image (hologram) generated by the interference between the object light L1 and the reference light L2 is imaged by the imaging apparatus 30. The imaging apparatus 30 comprises an imaging element such as a complementary metal-oxide-semiconductor (CMOS) image sensor, and generates image data of the interference image.
(Generation of Phase Image: Step S2)
An example of a method for acquiring the phase image from the interference image will be described below. First, an interference image (hologram) of the cell 60 acquired by the imaging apparatus 30 is trimmed to a size of, for example, 2048×2048 pixels and is then subjected to a two-dimensional Fourier transform. The Fourier-transformed image obtained by this processing may include components based on direct light, object light, and conjugate light.
Subsequently, the position of the object light is specified from its deviation amount with respect to the direct light in the Fourier-transformed image, and the complex amplitude component of only the object light is extracted by frequency filtering processing using a mask having a circular opening with a radius of 250 pixels.
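For illustration only, the trimming, Fourier transform, and frequency filtering steps described above can be sketched as follows. The carrier position (cx, cy) and the array sizes are hypothetical inputs; the 250-pixel mask radius follows the text.

```python
import numpy as np

# Illustrative sketch, not part of the disclosed embodiment. `hologram`
# is the trimmed interference image; (cx, cy) is the specified position
# of the object-light term in the Fourier-transformed image.
def extract_object_term(hologram, cx, cy, radius=250):
    f = np.fft.fftshift(np.fft.fft2(hologram))           # two-dimensional Fourier transform
    ny, nx = hologram.shape
    y, x = np.ogrid[:ny, :nx]
    mask = (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2  # circular opening
    filtered = np.where(mask, f, 0)                      # object-light component only
    # Shift the selected carrier back onto the DC position before inverting
    filtered = np.roll(filtered, (ny // 2 - cy, nx // 2 - cx), axis=(0, 1))
    return np.fft.ifft2(np.fft.ifftshift(filtered))      # complex amplitude u(x, y; 0)
```

The returned complex amplitude corresponds to the wave front u(x, y; 0) used in the angular spectrum processing below.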
Subsequently, for example, an angular spectrum method is applied to restore an image showing the phase of the cell 60 at any spatial position. Specifically, an angular spectrum U(fx, fy; 0) is obtained as the Fourier transform of the wave front u(x, y; 0) captured on the imaging surface of the imaging apparatus 30. Subsequently, as represented in Equation (1) below, the angular spectrum U(fx, fy; 0) is multiplied by a transfer function H(fx, fy; z), and thus, a wave front at any position z in an optical axis direction (z direction) is reproduced. Here, the transfer function H(fx, fy; z) is a frequency response function (the Fourier transform of an impulse response function (Green's function)).
Subsequently, as represented in Equation (2) below, an inverse Fourier transform is performed on a wave front U(fx, fy; z) at the position z in the optical axis direction (z direction), and thus, a solution u(x, y; z) at the position z is derived.
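Equations (1) and (2) above can be sketched numerically as follows; the grid spacing dx and propagation distance z are hypothetical parameters.

```python
import numpy as np

# Illustrative sketch of the angular spectrum method (Equations (1) and (2)).
def angular_spectrum_propagate(u0, z, wavelength, dx):
    ny, nx = u0.shape
    fx = np.fft.fftfreq(nx, d=dx)                # spatial frequencies fx
    fy = np.fft.fftfreq(ny, d=dx)                # spatial frequencies fy
    FX, FY = np.meshgrid(fx, fy)
    U0 = np.fft.fft2(u0)                         # angular spectrum U(fx, fy; 0)
    arg = 1.0 / wavelength ** 2 - FX ** 2 - FY ** 2
    # Transfer function H(fx, fy; z); evanescent components are set to zero
    H = np.where(arg > 0, np.exp(2j * np.pi * z * np.sqrt(np.maximum(arg, 0.0))), 0)
    return np.fft.ifft2(U0 * H)                  # u(x, y; z) by Equation (2)
```

For a uniform (plane-wave) input, the propagated field keeps unit amplitude and acquires only a constant phase, as expected from the transfer function.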
Subsequently, the phase image is generated by deriving a phase φ of u(x, y; z) as represented in Equation (3) below.
A phase in the phase image before unwrapping obtained by the above processing is wrapped into the range of 0 to 2π. Therefore, phase jumps at the 2π boundaries are connected by applying a phase connection (unwrapping) method such as unweighted least squares or Flynn's algorithm, and thus, a final phase image can be obtained. It should be noted that many unwrapping methods have been proposed, and an appropriate method that does not cause phase mismatch may be selected.
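Equation (3) and the unwrapping step can be sketched as follows. The row-then-column np.unwrap used here is a crude stand-in for the robust two-dimensional methods named above and is adequate only for smooth, low-noise phase maps.

```python
import numpy as np

def phase_image(u):
    # Equation (3): wrapped phase of u(x, y; z), mapped into [0, 2*pi)
    return np.angle(u) % (2 * np.pi)

def simple_unwrap(phi):
    # Crude stand-in for unweighted least squares or Flynn's algorithm:
    # connect 2*pi jumps along rows, then along columns
    return np.unwrap(np.unwrap(phi, axis=1), axis=0)
```

A robust two-dimensional unwrapper should replace `simple_unwrap` whenever the phase map contains noise or residues.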
Here, in a case where a phase of a background (a region where the cell 60 is not present) in the same focal plane of the phase image IP is PB and a phase of a region where the cell 60 is present is PS, a phase amount P in the phase image IP is represented by Equation (4) below. It should be noted that the term “phase” in the present specification refers to the phase of the electric field amplitude in a case where light is regarded as an electromagnetic wave, and is used in this general sense.
P = PS − PB (4)
In addition, a phase amount Pk at each pixel k of the phase image IP can be represented by Equation (5) below. Here, nk is a difference between a refractive index of the cell 60 at a portion corresponding to each pixel k of the phase image IP and a refractive index of the surroundings of the cell, dk is a thickness of the cell 60 at the portion corresponding to each pixel k of the phase image IP, and λ is a wavelength of the object light in the hologram optical system.
(Derivation of Evaluation Value: Step S3)
The phase image of the cell 60 is an image showing a phase distribution of the object light L1 transmitted through the cell 60, and is also an image showing an optical path length distribution of the object light transmitted through the cell 60. Since an optical path length in the cell 60 corresponds to a product of the difference in refractive index between the cell 60 and its surroundings and the thickness of the cell 60, the phase image of the cell 60 includes information on the refractive index and the thickness (shape) of the cell 60, as represented in Equation (5). Since the state of the cell 60 is reflected in the phase image, the quality of the cell 60 can be evaluated based on the phase image. Specifically, a total phase amount PA can be used as the evaluation value of the cell 60.
The total phase amount PA is represented by Equation (6) below. Here, s is an area of each pixel k of the phase image, and vk is a volume of the cell 60 at the portion corresponding to each pixel k of the phase image. As represented in Equation (6), the total phase amount PA corresponds to an amount obtained by integrating and accumulating the phase amounts Pk over all the pixels k of the phase image of the cell 60. The pixel value of the phase image corresponds to the phase amount Pk.
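As an illustration, Equation (6) can be sketched as a sum of the per-pixel phase amounts weighted by the pixel area s; the values in the example are hypothetical.

```python
import numpy as np

# Illustrative sketch of Equation (6): the total phase amount PA
# integrates and accumulates the phase amounts Pk over all pixels k,
# each pixel having area s.
def total_phase_amount(phase_img, pixel_area):
    return float(np.sum(phase_img) * pixel_area)
```

For example, a uniform phase amount of 1 over a 4×4 image with s = 0.25 yields PA = 4.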
(Derivation of Correction Coefficient: Step S4)
As illustrated in
n = PQ1/DL (7)
In step S12, a virtual phase image showing a phase distribution of light transmitted through the virtual object created in step S11 is generated. Specifically, an interference image of the virtual object is generated by using a numerical method for calculating optical propagation, such as the finite-difference time-domain (FDTD) method. That is, the interference image of the virtual object is generated on a computer. Thereafter, a phase image is generated by performing the same processing as in step S2 on this interference image. The phase image generated for the virtual object is referred to as a virtual phase image in the present specification. It should be noted that the focus position used in generating the virtual phase image is preferably determined by the same method as the focus position used in generating the phase image of the object to be evaluated. For example, a position where a variance of an amplitude image generated from the interference image is minimized can be determined as the focus position.
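For illustration, a virtual phase image of a single rotated ellipsoid can be approximated with a simple thickness-projection model in place of the rigorous FDTD computation described above; all dimensions, the refractive-index difference, and the omission of the 1/λ factor of Equation (5) are simplifying assumptions.

```python
import numpy as np

# Simplified stand-in for step S12 (the text uses an FDTD computation;
# this projection model ignores diffraction). The phase amount at each
# pixel is the refractive-index difference n_diff times the chord length
# of an ellipsoid with semi-axes (a, b, c), rotated in-plane by angle_deg.
def virtual_phase_image(n_diff, a, b, c, angle_deg, grid=129, extent=1.5):
    th = np.radians(angle_deg)
    xs = np.linspace(-extent, extent, grid)
    X, Y = np.meshgrid(xs, xs)
    # Rotate the sampling grid to realize the alignment angle
    Xr = X * np.cos(th) + Y * np.sin(th)
    Yr = -X * np.sin(th) + Y * np.cos(th)
    s = 1.0 - (Xr / a) ** 2 - (Yr / b) ** 2
    thickness = 2.0 * c * np.sqrt(np.maximum(s, 0.0))  # chord along the optical axis
    return n_diff * thickness
```

A two-cell-stage virtual object of the kind described above could be modeled analogously by combining the projections of two such ellipsoids connected along one axis.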
The virtual phase image is generated for each of cases where an orientation (alignment angle) of the virtual object is changed.
In step S13, a total phase amount is derived by performing the same processing as in step S3 on the virtual phase image generated in step S12. That is, the total phase amount is derived by applying Equation (6) to the virtual phase image. The total phase amount derived for the virtual phase image is referred to as a virtual total phase amount PAV in the present specification. The virtual total phase amount PAV is derived for each of the cases where the orientation (alignment angle) of the virtual object is changed.
In step S14, a standard total phase amount PAS is derived for the virtual object. The standard total phase amount PAS is a standard value of the total phase amount in the virtual object and is represented by Equation (8) below. In Equation (8), n is a difference in refractive index between the virtual object and the surroundings, and V is a volume of the virtual object. The standard total phase amount PAS is constant regardless of the orientation (alignment angle) of the virtual object.
PAS = n × V (8)
In step S15, a ratio of the standard total phase amount PAS derived in step S14 to the virtual total phase amount PAV derived in step S13 is derived as a correction coefficient C. That is, the correction coefficient C is expressed by Equation (9) below. The correction coefficient C is derived for each of the cases where the orientation (alignment angle) of the virtual object is changed, by using the virtual total phase amount PAV derived for each of those cases.
C = PAS/PAV (9)
(Correction of Evaluation Value: Step S5)
The total phase amount PA as the evaluation value derived in step S3 is corrected by using the correction coefficients C derived in step S4. Specifically, the orientation (alignment angle) of the object to be evaluated is specified in the phase image used in a case where the total phase amount PA is derived for the object.
Subsequently, the correction coefficient C corresponding to the specified orientation (alignment angle) of the object is extracted from among the correction coefficients C derived in step S4. Then, a correction value PX is acquired by multiplying the extracted correction coefficient C by the total phase amount PA as the evaluation value derived in step S3. That is, the correction value of the total phase amount PA is expressed by Equation (10) below.
PX = C × PA (10)
As described above, according to the method for acquiring an evaluation value according to the disclosed technology, an influence caused by the orientation of the object on the evaluation value derived based on the phase image of the object to be evaluated can be suppressed. Accordingly, the evaluation value can be stably acquired, and the variation in the evaluation value can be suppressed.
It should be noted that, in the present embodiment, a case where the evaluation target is a cell has been illustrated, but the disclosed technology is not limited to this aspect. Any object having transparency to the object light, including an industrial product, can be evaluated. The disclosed technology is particularly effective in a case where the object to be evaluated has a non-spherical shape. In addition, in the present embodiment, the total phase amount has been described as the evaluation value for evaluating the state of the object, but the disclosed technology is not limited to this aspect. For example, a phase density obtained by dividing the total phase amount by the volume of the object, an average phase amount which is an average value of the pixel values in the phase image, a maximum phase amount which is a maximum value of the pixel values in the phase image, a variance of the pixel values in the phase image, and the like can also be used as the evaluation value.
It should be noted that the disclosure of Japanese Patent Application No. 2021-072083 filed on Apr. 21, 2021 is incorporated herein by reference in its entirety. In addition, all documents, patent applications, and technical standards described in the specification are incorporated herein by reference to the same extent as if each individual document, patent application, or technical standard were specifically and individually indicated to be incorporated by reference.
This application is a continuation application of International Application No. PCT/JP2022/014972, filed on Mar. 28, 2022, the disclosure of which is incorporated herein by reference in its entirety. Further, this application claims priority from Japanese Patent Application No. 2021-072083, filed on Apr. 21, 2021, the disclosure of which is incorporated herein by reference in its entirety.