The present disclosure relates to a microscopy system, a microscopy method, and a computer readable recording medium.
In observation of a subject having a height difference by use of an industrial microscope, or in observation of a subject having a thickness, such as a cell nucleus or a stem cell, by use of a biological microscope having a focal depth of several tens of micrometers, there is a user need for quick identification of a region of interest present in a depth direction (Z direction) along the optical axis of the observation optical system. To meet this need, there is a method in which plural images having different focal planes are acquired by sequential imaging performed while the focal plane of the observation optical system is shifted along the optical axis, and these plural images are displayed in a three-dimensional arrangement. By this three-dimensional display, three-dimensional information is able to be observed from an arbitrary direction, and the position of a region of interest is able to be checked. The plural images with different focal planes acquired in this way are collectively called Z-stack images.
However, when plural focused structures are present in the depth direction, as in a transparent subject or a subject having a height difference, the conventional three-dimensional display causes occlusion: a structure in the background is hidden by another structure present at a shallower position in the observation direction, or by blur from an out-of-focus position, and thus there is a region whose structure is unable to be seen directly (intuitively). When occlusion occurs, an image captured at the position where the structure desired to be observed is present needs to be searched for and checked two-dimensionally. There is also a method in which a three-dimensional position is checked by displaying only the structure having the maximum luminance in the depth direction, but this method has a problem in that structures not having the maximum luminance are not reproduced.
Therefore, a method may be used in which an all-focused image is generated from Z-stack images, the presence and X-Y positions of two-dimensionally overlapping structures are checked on the all-focused image, and a search is then performed in the depth direction. Methods of generating an all-focused image include: a method of reconstructing a multi-focused image synthesized by superimposition of Z-stack images; and a method of extracting a focused area from each of the Z-stack images and synthesizing the focused areas. Such an all-focused image is useful for screening plural structures arranged in the depth direction.
For example, Japanese Patent Application Laid-open No. 2014-021490 discloses a method in which an area desired to be observed is selected, by use of a user interface, from an all-focused image generated from Z-stack images, and Z-stack images that are in focus are extracted and displayed based on in-focus-ness.
Further, Japanese Patent Application Laid-open No. 2014-021489 discloses a method in which in-focus-ness is calculated in Z-stack images, extraction candidates are selected based on the in-focus-ness in the depth direction, weighting according to the in-focus-ness is executed, and synthesis is performed. According to Japanese Patent Application Laid-open No. 2014-021489, a depth map is able to be generated based on a peak of in-focus-ness at each X-Y position, and the Z position of the peak of in-focus-ness is able to be known from this depth map.
Further, International Publication No. WO 2011/158498 discloses a technique in which two images respectively focused on a near end side and a far end side of a subject are acquired, together with an all-focused image generated by imaging executed while an image sensor is swept from the near end side to the far end side of the subject; the images respectively focused on the near end side and the far end side are reconstructed from the all-focused image; an amount of blurring in a partial area of an image is thereby calculated; and a distance from the optical system to the subject is thereby acquired and a distance map is generated.
A microscopy system according to one aspect of the present disclosure includes: an imaging unit configured to capture a subject image generated by an observation optical system of a microscope, and acquire an image; a shifting unit configured to shift positions of a focal plane and a field of view of the observation optical system; an imaging control unit configured to cause the imaging unit to acquire a multi-focused image including image information in plural planes in an optical axis direction of the observation optical system, by shifting the positions of the focal plane and the field of view in one exposure period of the imaging unit; a shift amount acquisition processing unit configured to acquire a shift amount, by which the position of the field of view is shifted; an all-focused image generating unit configured to respectively generate plural all-focused images, based on plural multi-focused images respectively acquired under plural conditions where the shift amounts differ, and on blurring information of images according to the shift amounts; and a display unit configured to consecutively display the plural all-focused images generated by the all-focused image generating unit.
The above and other features, advantages and technical and industrial significance of this disclosure will be better understood by reading the following detailed description of presently preferred embodiments of the disclosure, when considered in connection with the accompanying drawings.
Hereinafter, embodiments of a microscopy system, a microscopy method, and a microscopy program, according to the present disclosure, will be described in detail, while reference is made to the drawings. The present disclosure is not limited by these embodiments. Further, the same reference signs are used for reference to the same portions, throughout the drawings.
The trinocular lens barrel unit 101 branches the observation light entering from the objective lens 140 toward the eyepiece unit 103, for direct observation of the subject S by a user, and toward the imaging unit 211 described later.
The epi-illumination unit 110 includes an epi-illumination light source 111 and an epi-illumination optical system 112, and irradiates the subject S with epi-illumination light. The epi-illumination optical system 112 includes various optical members that condense illumination light emitted from the epi-illumination light source 111, and guide the condensed illumination light in a direction of an optical axis L of the observation optical system 104, the various optical members including, specifically, a filter unit, a shutter, a field stop, and an aperture diaphragm.
The transmitting illumination unit 120 includes a transmitting illumination light source 121 and a transmitting illumination optical system 122, and irradiates the subject S with transmitting illumination light. The transmitting illumination optical system 122 includes various optical members that condense illumination light emitted from the transmitting illumination light source 121, and guide the condensed illumination light in the direction of the optical axis L, the various optical members including, specifically, a filter unit, a shutter, a field stop, and an aperture diaphragm.
One of the epi-illumination unit 110 and the transmitting illumination unit 120 is selected and used according to the microscopy method. Only one of the epi-illumination unit 110 and the transmitting illumination unit 120 may be provided in the microscope device 10.
The electrically driven stage unit 130 includes the stage 131, a stage drive unit 132 that moves the stage 131, and a position detecting unit 133. The stage drive unit 132 is formed of, for example, a motor. A subject placement surface 131a of the stage 131 is provided to be orthogonal to the optical axis of the objective lens 140. Hereinafter, the subject placement surface 131a will be referred to as the X-Y plane, and the normal direction of the X-Y plane, that is, the direction parallel to the optical axis, will be referred to as the Z direction. Along the Z direction, the downward direction in the drawings, that is, the direction in which the stage 131 (subject placement surface 131a) moves away from the objective lens 140, will be referred to as the plus direction.
By the stage 131 being moved in the X-Y plane, a position of a field of view of the objective lens 140 is able to be shifted. Further, by the stage 131 being moved in the Z direction, a focal plane of the objective lens 140 is able to be shifted along the optical axis L. That is, the electrically driven stage unit 130 is a shifting means that shifts positions of the focal plane and the field of view by moving the stage 131 under control by an imaging control unit 22 described later.
The position detecting unit 133 is formed of, for example, an encoder that detects an amount of rotation of the motor forming the stage drive unit 132, and thereby detects the position of the stage 131 and outputs a detection signal. Instead of the stage drive unit 132 and the position detecting unit 133, a pulse generating unit that generates pulses according to control by the imaging control unit 22 described later, and a stepping motor, may be provided.
The objective lens 140 is installed on a revolver 142 that is able to hold plural objective lenses having magnifications different from one another (for example, the objective lens 140 and an objective lens 141). By rotating the revolver 142 to change the objective lens 140 or 141 opposing the stage 131, the imaging magnification is able to be changed.
Next, a configuration of the imaging device 20 will be described.
The image acquiring unit 21 includes the imaging unit 211 and a memory 212. The imaging unit 211 includes an imager 211a formed of, for example, a CCD or a CMOS, and is formed by use of a camera that is able to capture a color image having a pixel level (pixel value) in each of the R (red), G (green), and B (blue) bands at each pixel included in the imager 211a. Alternatively, the imaging unit 211 may be formed by use of a camera that is able to capture a monochrome image, in which a luminance value Y is output as the pixel level (pixel value) of each pixel.
The memory 212 is formed of a recording device, for example a semiconductor memory such as a rewritable flash memory, a RAM, or a ROM, and temporarily stores the image data generated by the imaging unit 211.
The imaging control unit 22 outputs a control signal to the microscope device 10 and moves the stage 131 in one exposure period of the imaging unit 211, thereby shifting the positions of the focal plane and the field of view of the objective lens 140. The imaging control unit 22 thus executes control of causing a multi-focused image to be acquired, the multi-focused image including image information in plural planes in the direction of the optical axis L of the observation optical system 104.
The control unit 23 is formed of hardware, such as, for example, a CPU, and, by reading programs stored in the storage unit 24, integrally controls operation of the imaging device 20 and the whole microscopy system 1, based on various parameters stored in the storage unit 24 and on information input from the input unit 25. Further, the control unit 23 executes processing for generation of an all-focused image by performing predetermined image processing on image data input from the image acquiring unit 21, and executes control of causing a display device 30 to display the generated all-focused image.
In detail, the control unit 23 includes: a shift amount acquisition processing unit 231 that acquires a shift amount, by which the position of the field of view of the observation optical system 104 is shifted when a multi-focused image is acquired; and an all-focused image generating unit 232 that generates an all-focused image by reconstructing a multi-focused image by using a point spread function (PSF) representing blurring in an image.
The storage unit 24 is formed of: a recording device, for example a semiconductor memory such as a rewritable flash memory, a RAM, or a ROM; or a recording medium, such as a hard disk, an MO, a CD-R, or a DVD-R, which is built in or connected via a data communication terminal, together with a writing and reading device that writes information into and reads information from the recording medium. The storage unit 24 includes: a parameter storage unit 241 that stores parameters used in calculation by the control unit 23; a setting information storage unit 242; and a program storage unit 243 that stores various programs. Among these, the parameter storage unit 241 stores parameters such as the shift amount by which the position of the field of view is shifted when a multi-focused image is acquired. Further, the program storage unit 243 stores a control program for causing the imaging device 20 to execute predetermined operation, an image processing program, and the like.
The input unit 25 is formed of: an input device, such as a keyboard, various buttons, or various switches; a pointing device, such as a mouse or a touch panel; and the like, and inputs, to the control unit 23, signals corresponding to operations performed on these devices.
The output unit 26 is an external interface that outputs an image based on image data acquired by the image acquiring unit 21, an all-focused image generated by the control unit 23, and various other types of information, to an external device, such as the display device 30, and causes the external device to display them in a predetermined format.
This imaging device 20 may be formed by combining, for example, a general-purpose device, such as a personal computer or a workstation, with a general-purpose digital camera, via an external interface.
The display device 30 is formed of, for example, an LCD, an EL display, or a CRT display, and displays thereon an image and related information output from the output unit 26. In this first embodiment, the display device 30 is provided outside the imaging device 20, but the display device 30 may be provided inside the imaging device 20.
Next, operation of the microscopy system 1 will be described.
Firstly, at Steps S10 to S16, the image acquiring unit 21 acquires plural multi-focused images having different shift amounts.
At the start of the processing of Step S10, the field of view V of the observation optical system 104 is assumed to be set at a predetermined initial position on the subject S. At Step S10, the image acquiring unit 21 acquires a multi-focused image having a shift amount of zero (σ11=0). In detail, in one exposure period of the imaging unit 211, the focal plane is shifted along the optical axis L while the position of the field of view V is kept fixed, whereby image information of the plural slices Fj within the superimposed imaging range is accumulated into a single image.
Thereafter, at Step S11, the all-focused image generating unit 232 acquires point spread function (PSF) information representing image blurring in the image of each slice Fj, and generates a PSF image based on this PSF information. A point spread function is stored in the parameter storage unit 241 beforehand, in association with: an imaging condition, such as a magnification of the objective lens 140 in the microscope device 10; and a slice Fj. The all-focused image generating unit 232 reads, based on an imaging condition, such as a magnification of the objective lens 140, a point spread function corresponding to a slice Fj from the parameter storage unit 241, calculates, based on the point spread function, a pixel value corresponding to each pixel position in the image in the field of view V, and thereby generates a PSF image for each slice Fj.
At subsequent Step S12, the all-focused image generating unit 232 generates a multi-focused PSF image having a shift amount of zero, the multi-focused PSF image corresponding to the multi-focused image SI11. In detail, by calculation of an arithmetic mean of pixel values of pixels at corresponding positions among plural PSF images respectively corresponding to the slices Fj generated at Step S11, a pixel value of each pixel in the multi-focused PSF image is calculated.
At Step S13, the all-focused image generating unit 232 reconstructs the multi-focused image SI11 generated at Step S10 by using the multi-focused PSF image. Thereby, an all-focused image having a shift amount of zero is generated from the multi-focused image SI11.
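The processing of Steps S11 to S13 can be illustrated as follows. This is a minimal sketch in Python/NumPy, assuming the per-slice PSFs are available as centered 2-D arrays of the same shape as the image; Wiener-regularized inverse filtering is used here as one common way of reconstructing (deconvolving) the multi-focused image, since the disclosure does not fix a particular reconstruction method, and the function names and the regularization constant eps are illustrative.

```python
import numpy as np

def make_multifocus_psf(psfs):
    """Steps S11-S12: average the per-slice PSF images (arithmetic mean of
    pixel values at corresponding positions) into one multi-focused PSF."""
    return np.mean(np.stack(psfs, axis=0), axis=0)

def reconstruct_all_focused(multifocus_image, multifocus_psf, eps=1e-3):
    """Step S13: reconstruct the multi-focused image with the multi-focused
    PSF.  The PSF is assumed centered and of the same shape as the image."""
    H = np.fft.fft2(np.fft.ifftshift(multifocus_psf))
    G = np.fft.fft2(multifocus_image)
    F = G * np.conj(H) / (np.abs(H) ** 2 + eps)  # regularized inverse filter
    return np.real(np.fft.ifft2(F))
```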
After the all-focused image having the shift amount of zero is generated at Steps S10 to S13, the shift amount acquisition processing unit 231 acquires shift amounts σi (i=12, . . . , n) to be used in acquisition of plural multi-focused images (Step S14). The subscript i is a variable indicating the acquisition order of the multi-focused images.
At subsequent Step S15, the imaging control unit 22 sets imaging parameters based on the shift amount σi acquired at Step S14. Specifically, the imaging control unit 22 calculates, as the imaging parameters: an imaging start position; a movement distance by which the field of view V is moved to that imaging start position; and a shift velocity at which the field of view V is shifted during exposure.
The imaging control unit 22 firstly calculates a movement distance ti=σi×N, by which the field of view is moved to the next imaging start position. Herein, N represents the number of depths of field Δz included in the thickness D of the superimposed imaging range (N=D/Δz). In this microscope device 10, the movement distance corresponds to movement of the stage 131 along the X direction by a distance σi×N×p/M, where p is the pixel pitch of the imaging unit 211 and M is the observation magnification.
Further, as the imaging parameter, the imaging control unit 22 calculates a shift velocity vi, at which the field of view V is shifted along the X direction in one exposure period. The shift velocity vi is given by the following Equation (1), by use of a period Ti of exposure of one time, the pixel pitch p of the imaging unit 211, the number N, and the observation magnification of M times.
vi=(p×σi/M)/(Ti/N) (1)
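The imaging-parameter calculation of Steps S14 and S15 amounts to a few arithmetic steps. The following sketch, with illustrative names and units following the text (pixels, micrometers, seconds), computes the movement distance, the corresponding stage travel, and the shift velocity of Equation (1):

```python
def imaging_parameters(sigma_px, N, p_um_per_px, M, T_s):
    """sigma_px: shift amount per slice (pixels); N: number of depths of
    field in the superimposed range (N = D / dz); p_um_per_px: pixel pitch;
    M: observation magnification; T_s: one exposure period (seconds)."""
    t_px = sigma_px * N                        # movement distance to the next start (pixels)
    stage_um = sigma_px * N * p_um_per_px / M  # corresponding stage travel along X (um)
    v_um_per_s = (p_um_per_px * sigma_px / M) / (T_s / N)  # Equation (1)
    return t_px, stage_um, v_um_per_s
```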
At subsequent Step S16, under control by the imaging control unit 22 based on the imaging parameters set at Step S15, the image acquiring unit 21 performs imaging of the subject S while shifting the positions of the focal plane and the field of view V of the observation optical system 104 in one exposure period of the imaging unit 211, and thereby acquires a multi-focused image having the shift amount σi.
The directions in which the positions of the focal plane and the field of view V are shifted are not limited to those described above.
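How one exposure with such shifting superimposes the slices can be sketched as follows; this is an idealized simulation, not the device's actual sensor integration, in which the continuous sweep is approximated by the N discrete slices and, per Equation (4) described later (s i,j=σi×j), the image of slice Fj is displaced by σi×j pixels along X:

```python
import numpy as np
from scipy.ndimage import shift as nd_shift

def simulate_multifocus(slices, sigma_px):
    """Step S16, idealized: accumulate each slice Fj displaced by
    sigma_px * j pixels along X, then take the arithmetic mean."""
    acc = np.zeros_like(slices[0], dtype=float)
    for j, sl in enumerate(slices):
        acc += nd_shift(sl.astype(float), shift=(0.0, sigma_px * j),
                        order=1, mode="nearest")
    return acc / len(slices)
```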
At Step S17 subsequent to Step S16, the all-focused image generating unit 232 generates an all-focused image, based on the plural multi-focused images acquired at Step S16.
At subsequent Step S172, by using the plural PSF images that have been subjected to the shifting processing at Step S171, the all-focused image generating unit 232 generates a multi-focused PSF image PIi having the shift amount σi. In detail, by calculation of an arithmetic mean of pixel values of pixels at corresponding positions among the plural PSF images that have been subjected to the shifting processing, a pixel value of each pixel in the multi-focused PSF image PIi is calculated.
At Step S173, the all-focused image generating unit 232 reconstructs the multi-focused image SIi acquired at Step S16, by using the multi-focused PSF image PIi. Thereby, an all-focused image AIi is generated from the multi-focused image SIi. Thereafter, the operation of the control unit 23 returns to the main routine.
At Step S18 subsequent to Step S17, the imaging device 20 outputs image data of the all-focused image generated at Step S17, to the display device 30, and causes the display device 30 to display this all-focused image. The all-focused image having a shift amount of zero generated at Step S13 may be displayed after Step S13, or may be displayed at Step S18.
At Step S19 subsequent to Step S18, the control unit 23 determines whether or not the variable i has reached a maximum value n. If the variable i has not reached the maximum value n (Step S19: No), the control unit 23 increments the variable i (Step S20). Thereafter, the operation of the control unit 23 returns to Step S14. By repetition of Steps S14 to S18 described above, the plural multi-focused images SIi having different shift amounts σi are acquired, and are sequentially displayed on the display device 30. Further, if the variable i has reached the maximum value n (Step S19: Yes), the control unit 23 ends the operation for generation and display of all-focused images.
By appropriate setting of the imaging parameters controlling the acquisition order of the multi-focused images SIi, the imaging start positions, and the shift directions of the positions of the focal plane and the field of view V; the multi-focused images SIi are able to be efficiently acquired with the amount of movement of the stage 131 being kept small and the total imaging time being shortened.
The imaging range in the Z-direction, the non-shifted Z-position, the shift amount, and the number of times of shifting may be made settable by a user.
For setting of the imaging range in the Z direction, for example, any one of large, medium, and small depths of field may be set. For setting of the non-shifted Z position, for example, whether a fixed position in the Z direction is to be set at a near side or a far side may be selected. For setting of the shift amount, for example, any one of large, medium, and small shift amounts may be selected. For setting of the number of times of shifting, for example, any one of 20, 30, and 40 may be selected.
As described above, in the first embodiment, since the magnitude of the shift amount σi is changed among the plural multi-focused images SIi; a state where the subject S is virtually observed from plural directions of a wider range is able to be reproduced and displayed as all-focused images consecutively on the display device 30. Therefore, a user is able to grasp the Z direction position of a structure in the subject S, how the structures overlap one another, and the front and back relations among the structures, intuitively and more realistically.
Further, according to the above described first embodiment, since a multi-focused image is acquired by imaging performed while the focal plane is moved in one exposure period, imaging is able to be executed in a shorter period of time than in a case where a multi-focused image is obtained by acquiring Z-stack images through a plural number of times of imaging and calculating arithmetic means of those Z-stack images, and the amount of data and the amount of calculation in image processing are able to be reduced significantly.
These shift amounts σi may be preset amounts, or may be acquired based on information input through the input unit 25 according to user operations. When a shift amount σi is determined according to a user operation, an angle of inclination of the line of sight of the user with respect to the upright direction of the subject S may be input. In this case, when the pixel pitch of the imaging unit 211 is p (μm/pixel) and the input angle is θi, the shift amount σi (pixels) is given by the following Equation (2).
σi=(Z/tan θi)/p (2)
In Equation (2), the distance Z may be approximated by the distance from the objective lens 140 to each depth in the subject S.
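Equation (2) reduces to a one-line conversion. A minimal sketch, assuming the input angle is given in degrees and that Z and p are in micrometers (units are assumptions for illustration):

```python
import math

def shift_from_angle(theta_deg, Z_um, p_um_per_px):
    """Equation (2): sigma_i = (Z / tan(theta_i)) / p, where theta_i is the
    user-input inclination of the line of sight and Z approximates the
    distance from the objective lens 140 to each depth in the subject S."""
    return (Z_um / math.tan(math.radians(theta_deg))) / p_um_per_px  # pixels
```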
In the above described first embodiment, in the processing for acquisition of plural multi-focused images having different shift amounts, the processing for the shift amount of zero (σ11=0) is executed first, but the image having the shift amount of zero may instead be acquired in the middle of the sequential change of the shift amount. Further, in the above described first embodiment, the shift amounts include zero (σ11=0) (Steps S10 to S13), but, not being limited thereto, the shift amounts may not include zero. In that case, all-focused images are able to be generated and displayed by execution of the processing from Step S14 to Step S20. Therefore, even if the shift amounts do not include zero (σ≠0), the above described effects are able to be achieved.
Next, a first modified example of the first embodiment will be described.
In the above described first embodiment, the optical axis of the observation optical system 104 is orthogonal to the stage 131, and when the multi-focused image SIi having the shift amount σi is acquired, imaging is executed while the stage 131 is moved in the Z direction and X direction. However, imaging may be executed with the optical axis of the observation optical system 104 being made inclined with respect to the stage 131 beforehand.
As described above, when the focal plane Pf is inclined with respect to the subject S, various imaging parameters are set as described below. When a shift amount between adjacent slices in the multi-focused image SIi is σi (pixels), the pixel pitch of the imaging unit 211 is p (μm/pixel), the number of depths of field Δz included in a superimposed imaging range of the thickness D is N (N=D/Δz), and the observation magnification is M times; an angle αi is given by the following Equation (3).
αi=tan⁻¹{(p×σi×N/M)/D} (3)
The shift amount acquisition processing unit 231 calculates and outputs the angle αi based on the shift amount σi. Based on this angle αi, the imaging control unit 22 executes imaging with the focal plane Pf inclined by the angle αi with respect to the subject S.
Or, the inclination may be provided beforehand by a base 161 installed in an inclined state.
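A sketch of Equation (3), assuming the same units as in Equation (1) (p in μm/pixel, D in μm); the function name is illustrative:

```python
import math

def tilt_angle_deg(sigma_px, N, p_um_per_px, M, D_um):
    """Equation (3): alpha_i = arctan((p * sigma_i * N / M) / D), the
    inclination of the focal plane Pf that realizes the shift amount
    sigma_i between adjacent slices over a range of thickness D."""
    return math.degrees(math.atan((p_um_per_px * sigma_px * N / M) / D_um))
```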
Next, a second modified example of the first embodiment according to the present disclosure will be described. In the above described first embodiment, by the positions of the focal plane and the field of view V being continuously shifted with the shutter being released in one exposure period of the imaging unit 211, a multi-focused image is acquired. However, a shutter that blocks incidence of light on the imaging unit 211 may be opened and shut at a predetermined cycle, and the positions of the focal plane and the field of view V may be shifted stepwise while the shutter is shut.
The number of times the shutter is opened and closed in one exposure period, that is, the number of times the imaging unit 211 is exposed to light from the subject S, or the number of times the positions of the focal plane and the field of view V are shifted, and the shift amounts of the positions of the focal plane and the field of view V per shift, are set as appropriate according to the exposure period of one time in the imaging unit 211, the shutter speed, and the like.
For example, when the multi-focused image SI11 having the shift amount of zero is acquired by this method, the shutter is opened and shut once per slice Fj in one exposure period, and the position of the focal plane is shifted stepwise while the shutter is shut, whereby image information of the discrete slices Fj is superimposed. In this case, at Step S11, PSF images are generated for the discrete slices Fj that are actually exposed.
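The stepwise variant can be sketched as follows; expose_slice and move_to_slice are hypothetical hardware callbacks, and the sensor-side accumulation within one exposure period is emulated here by summing one frame per slice:

```python
import numpy as np

def stepwise_multifocus(expose_slice, move_to_slice, n_slices):
    """Second modified example (sketch): the shutter is opened and shut once
    per slice, and the positions of the focal plane and the field of view V
    are shifted stepwise while the shutter is shut."""
    acc = None
    for j in range(n_slices):
        move_to_slice(j)        # shutter shut: shift focal plane / field of view
        frame = expose_slice()  # shutter open: expose slice Fj
        acc = frame.astype(float) if acc is None else acc + frame
    return acc / n_slices       # arithmetic mean over the exposed slices
```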
Next, a third modified example of the first embodiment will be described. In the above described first embodiment, for ease of understanding, the case where the field of view V of the observation optical system 104 is shifted only in the X direction has been described, but similar processing may be executed for the Y direction and the X-Y directions. In this case, an all-focused image corresponding to a case in which a virtual viewpoint with respect to the subject S is moved along the Y direction or an X-Y direction is generated. For the X-Y directions, all-focused images are generated respectively along two orthogonal directions. Further, by the field of view V of the observation optical system 104 being shifted in the X direction, the Y direction, and the X-Y directions, an all-focused image corresponding to a case in which a virtual viewpoint with respect to the subject S is moved in a horizontal plane is also able to be generated.
The display device 30 sequentially (consecutively) displays, in the same display area, in addition to the plural all-focused images AI0, AI+1, AI+2, AI−1, and AI−2 generated, the all-focused images AI+21, AI+22, AI−21, AI−22, AI+31, AI+32, AI−31, AI−32, AI+41, AI+42, AI−41, and AI−42, in chronological order. The shift in the X-Y direction is able to be realized by movement or rotation of the lens barrel 102 and the stage 131.
As described above, in this third modified example, since the magnitude of the shift amount σi is changed among the plural multi-focused images SIi, the state where the subject S is virtually observed from plural directions of a wider range is able to be reproduced and displayed as all-focused images consecutively on the display device 30. Therefore, a user is able to grasp the Z direction position of a structure in the subject S, how the structures overlap one another, and the front and back relations among the structures, intuitively and more realistically.
Next, a fourth modified example of the first embodiment will be described. In the above described first embodiment, the case where the field of view V of the observation optical system 104 is shifted in a predetermined direction has been described, but a direction of the shift may be specified by a user.
In this fourth modified example, based on a direction specified by a user from an arbitrary all-focused image, all-focused images in plural viewpoint directions are displayed consecutively. As a method for the specification of a direction, any means may be used, such as mouse movement, a line of sight, movement of a finger touching a screen, or a spatial user operation (for example, with a finger, a hand, or the like), as long as a desired direction of movement is able to be identified.
When setting of a selection of an all-focused image to be displayed is input by a user, an arbitrary all-focused image (for example, an all-focused image having a shift amount of zero) is displayed under control by the control unit 23. By operating a mouse or the like via the input unit 25, the user moves a pointer in the direction in which shifting is desired to be executed. Specifically, when a direction is determined from the movement of the pointer, movement vectors Vx and Vy are acquired from a locus L1 of movement of a pointer P1 to the position of a pointer P2 on the screen, and are determined as a shift direction of the position of the X-Y plane. From the ratio between the actually measured Vx and Vy (pixels), a direction component (inclination component) is calculated, and the direction of this component is determined as the shift direction. An all-focused image is then generated and displayed, similarly to the first embodiment, from β multi-focused images captured with the number of times of shifting being β at the predetermined shift amount σi.
Further, if the pointer P2 on the screen has moved to a position of a pointer P3 by operation on the mouse or the like, the movement vectors Vx and Vy are acquired from a locus L2 of the moved pointer, and are determined as a shift direction of the position of the X-Y plane.
Further, although display of an all-focused image based only on the movement direction of the pointer has been described above, not being limited thereto, the number of times of shifting β may be determined, for example, from the movement amounts Vx and Vy and the predetermined shift amount σi.
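A sketch of this pointer-driven determination; the mapping from movement amount to the number of times of shifting β is one possible choice, and the function name and screen-coordinate convention are assumptions:

```python
import math

def shift_from_pointer(p_start, p_end, sigma_px):
    """Fourth modified example (sketch): derive the shift direction from the
    pointer locus (movement vectors Vx, Vy), and derive the number of times
    of shifting beta from the movement amount and the shift amount sigma_i."""
    vx = p_end[0] - p_start[0]
    vy = p_end[1] - p_start[1]
    angle = math.atan2(vy, vx)  # direction (inclination) component from Vx:Vy
    beta = max(1, round(math.hypot(vx, vy) / sigma_px))
    return (math.cos(angle), math.sin(angle)), beta
```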
According to this fourth modified example, by display of an all-focused image for a shift direction according to a user's operation, intuitive observation is able to be realized as if the user is moving the observed subject.
Next, a second embodiment will be described.
The imaging device 40 includes a control unit 41, instead of the control unit 23 described above in the first embodiment.
The slice-of-interest acquiring unit 411 acquires a position of a slice in the Z direction, the slice including a structure in the subject S corresponding to an observed region input via the input unit 25 from the display device 50 described later, and determines this slice as a slice of interest.
The display device 50 is formed of, for example, an LCD, an EL display, or a CRT display, and includes: an image display unit 51 that displays thereon an image and related information output from the output unit 26; and an observed region determining unit 52 that determines, according to operation performed from outside, a region in an all-focused image displayed on the image display unit 51 as an observed region, and inputs a signal representing the observed region to the control unit 41.
Next, operation of the microscopy system 2 will be described.
If the variable i has reached the maximum value n at Step S19 (Step S19: Yes), the observed region determining unit 52 determines whether or not a user operation for selection of an arbitrary observed region has been performed on any of all-focused images AI51, AI52, AI53, and AI54 displayed on the image display unit 51 (Step S21).
If the user operation has not been performed (Step S21: No), the operation of the microscopy system 2 returns to Step S12.
On the contrary, if the user operation has been performed (Step S21: Yes), the observed region determining unit 52 determines the region selected by the user operation as an observed region, and inputs a signal representing the observed region, to the control unit 41 (Step S22).
For example, a region desired to be selected may be unable to be checked on the all-focused image having the shift amount of zero. In this case, the display may be changed to an all-focused image acquired with shifting so that the user is able to check the region desired to be selected. In that case, all-focused images acquired as a result of shifting in the respective directions are assumed to be stored in the storage unit 24 already.
In a state where the arrow representing the −X direction and −Y direction has been selected by a pointer P5, when the selection of that arrow is input by clicking of the mouse or the like, the all-focused image having the shift amount of zero is able to be switched to an all-focused image acquired as a result of shifting by a predetermined amount in that direction. When an all-focused image on which the observed region desired to be selected is able to be checked is displayed by such shifting, the user selects the observed region on that all-focused image.
At subsequent Step S23, the control unit 41 acquires, based on information representing the observed region input from the observed region determining unit 52, Z-position information of the observed region.
At Step S231, the slice-of-interest acquiring unit 411 acquires X-Y position information of the observed region R54 in the all-focused image AI54.
At subsequent Step S232, the slice-of-interest acquiring unit 411 extracts regions R′51, R′52, and R′53 corresponding to the observed region R54, respectively from the all-focused images AI51, AI52, and AI53 other than the all-focused image AI54, and acquires X-Y position information of each of these regions. The regions R′51, R′52, and R′53 may be extracted by use of a publicly known image recognition technique, such as similarity computation by pattern matching or by use of the sum of absolute differences (SAD). Hereinafter, these regions R′51, R′52, and R′53 may also be referred to as observed regions.
At subsequent Step S233, the slice-of-interest acquiring unit 411 acquires shift amounts of the X-Y positions of the observed regions R′51, R′52, R′53, and R54 among the all-focused images AI51, AI52, AI53, and AI54.
At subsequent Step S234, the slice-of-interest acquiring unit 411 acquires the slice Fj including the observed regions R′51, R′52, R′53, and R54, based on the shift amounts of the observed regions R′51, R′52, R′53, and R54.
When a shift amount in an all-focused image AIi is σi, a shift amount si,j of a position of the field of view V in each slice Fj from a position of the field of view V in the slice F0 of the uppermost plane is given by the following Equation (4).
si,j=σi×j (4)
Therefore, if a shift amount |s(i+1),j−si,j| among the observed regions R′51, R′52, R′53, and R54 is given, from the following Equation (5), the slice Fj including the observed regions R′51, R′52, R′53, and R54 is able to be identified.
|s(i+1),j−si,j|=σi+1×j−σi×j
j=|s(i+1),j−si,j|/(σi+1−σi) (5)
The slice-of-interest acquiring unit 411 outputs the slice Fj acquired as described above, as Z position information of the observed region. Thereafter, the operation of the control unit 41 returns to the main routine.
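The identification of Steps S233 and S234 can be sketched as follows, assuming the observed region's X coordinate has been located (for example, by pattern matching) in each all-focused image; averaging the per-pair estimates of Equation (5) is an illustrative choice:

```python
def slice_of_interest(x_positions, sigmas):
    """Estimate the slice index j of an observed region from its X positions
    in the all-focused images AI_i with shift amounts sigma_i, via
    Equation (5): j = |s_(i+1),j - s_i,j| / (sigma_(i+1) - sigma_i)."""
    estimates = []
    for i in range(len(sigmas) - 1):
        d_s = abs(x_positions[i + 1] - x_positions[i])
        d_sigma = sigmas[i + 1] - sigmas[i]
        if d_sigma != 0:
            estimates.append(d_s / abs(d_sigma))
    return round(sum(estimates) / len(estimates))
```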
At Step S24 subsequent to Step S23, the control unit 41 acquires the observed region R54, based on the Z position information output by the slice-of-interest acquiring unit 411.
At subsequent Step S25, the imaging control unit 22 acquires a multi-focused image centering around the observed region R54. This multi-focused image is a multi-focused image resulting from shifting in at least one of the above described X direction, Y direction, and X-Y direction.
At subsequent Step S26, the all-focused image generating unit 232 generates an all-focused image, based on the acquired multi-focused image.
At subsequent Step S27, the control unit 41 causes the all-focused image generated, to be displayed on the image display unit 51 of the display device 50. Thereafter, the operation of the microscopy system 2 is ended.
According to the above described second embodiment, a user is able to intuitively and easily grasp the Z direction positions of structures that are seen to be overlapping on a plane, and front and back relations among the structures.
Next, a modified example of the second embodiment will be described. In the above described second embodiment, imaging is performed around the acquired observed region, without the superimposed imaging range being changed, but the superimposed imaging range may be changed around the observed region.
The superimposed imaging range D2 is, for example, with a depth of field (DOF) based on the observation magnification used as the unit, a range of 5 DOFs toward both a shallower side and a deeper side in the depth direction around a slice Fn including the observed region R55. Thereby, a multi-focused image having the slice Fn including the observed region R55 and the slices Fn−1, Fn−2, Fn+1, and Fn+2 at the shallower side and the deeper side in the depth direction is acquired.
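A small sketch of this range limitation; the slice pitch of 2.5 DOF is an assumption chosen so that a ±5 DOF range yields the slices Fn−2 to Fn+2 mentioned above:

```python
def limited_range_slices(n_center, half_range_dof=5.0, slice_pitch_dof=2.5):
    """Limit the superimposed imaging range to +/- half_range_dof depths of
    field around the slice F_n containing the observed region, and return
    the indices of the slices falling inside that range."""
    k = int(half_range_dof // slice_pitch_dof)
    return [n_center + d for d in range(-k, k + 1)]  # e.g. Fn-2 .. Fn+2
```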
According to this modified example, since the imaging range is limited so that an all-focused image is generated only from the slices including and around the observed region in the depth direction, an all-focused image on which a user is easily able to observe the observed region is able to be generated and displayed.
Next, a third embodiment will be described.
According to the above described third embodiment, when two observed regions are selected, all-focused images that have been shifted in a predetermined direction, with the middle position between the pieces of Z position information acquired from the two observed regions serving as the center of rotation, are consecutively displayed. The two observed regions are thereby displayed as if they exist at different positions, and thus a user is able to easily grasp the two observed regions.
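The rotation about the middle depth can be sketched by shifting each slice in proportion to its signed distance from a center slice; the per-slice displacement σi×(j−jc) is an inference from Equation (4), and the function name is illustrative:

```python
import numpy as np
from scipy.ndimage import shift as nd_shift

def multifocus_about_center(slices, sigma_px, j_center):
    """Third embodiment (sketch): superimpose the slices with displacements
    proportional to (j - j_center), so that consecutively displayed
    all-focused images appear to rotate about the middle depth."""
    acc = np.zeros_like(slices[0], dtype=float)
    for j, sl in enumerate(slices):
        acc += nd_shift(sl.astype(float), (0.0, sigma_px * (j - j_center)),
                        order=1, mode="nearest")
    return acc / len(slices)
```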
The present disclosure is not limited to the above described first to third embodiments and modified examples as they are, and variations may be formed by combining plural components disclosed in the respective embodiments and modified examples as appropriate. For example, some components may be excluded from all of the components disclosed in an embodiment. Or, components disclosed in different embodiments may be combined as appropriate.
As described above, the present disclosure may include various embodiments and the like not described herein, and design changes and the like may be performed without departing from the technical ideas stated in the claims, as appropriate.
The present disclosure has an effect of: enabling a highly visible image, from which a user is able to visually and intuitively grasp a Z direction position of a structure captured in an image and a front and back relation between structures, to be generated in a shorter time period than conventionally done; and enabling the amount of data and the amount of calculation in image processing to be reduced more than conventionally done.
Additional advantages and modifications will readily occur to those skilled in the art. Therefore, the disclosure in its broader aspects is not limited to the specific details and representative embodiments shown and described herein. Accordingly, various modifications may be made without departing from the spirit or scope of the general inventive concept as defined by the appended claims and their equivalents.
This application is a continuation of International Application No. PCT/JP2015/084448, filed on Dec. 8, 2015, the entire contents of which are incorporated herein by reference.