The present application claims foreign priority based on Japanese Patent Application No. 2016-144939, filed Jul. 22, 2016, the contents of which are incorporated herein by reference.
The present invention relates to a magnifying observation apparatus for magnifying and observing an observation target.
A magnifying observation apparatus is sometimes used in order to magnify and observe an observation target (see, for example, JP-A-2013-72971). In a microscope system described in JP-A-2013-72971, bright-field illumination light and dark-field illumination light are irradiated on a sample on a stage through an objective lens.
The bright-field illumination light is illumination light emitted in a direction parallel to an optical axis of the objective lens. The dark-field illumination light is illumination light emitted in a direction inclined with respect to the optical axis of the objective lens. Observation light reflected on the sample is made incident on an imaging apparatus through an imaging lens, whereby the sample is imaged.
In the microscope system described in JP-A-2013-72971, the illumination intensity of one illumination light is relatively reduced according to a ratio of an exposure time for the bright-field illumination light and an exposure time for the dark-field illumination light. Consequently, the intensity of the bright-field illumination light and the intensity of the dark-field illumination light are aligned. JP-A-2013-72971 mentions that, as a result, it is possible to observe the sample at the most suitable illumination intensity when the bright-field illumination light and the dark-field illumination light are simultaneously irradiated on the sample. A user is capable of intuitively adjusting a ratio of illumination intensities of the bright-field illumination light and the dark-field illumination light from an image in an optimized illumination intensity state.
However, appropriate imaging conditions such as illumination intensity differ depending on the shape and the material of an observation target. Therefore, it is difficult for an unskilled user to acquire an image captured under the appropriate imaging conditions. It is sometimes found only after imaging that the imaging conditions were inappropriate. In such a case, imaging of the observation target needs to be performed again under different imaging conditions. Therefore, a burden on the user increases.
An object of the present invention is to provide a magnifying observation apparatus capable of easily acquiring an image of an observation target corresponding to a request of a user.
(1) A magnifying observation apparatus according to the present invention includes: a stage on which an observation target is placed; a light projecting device configured to selectively irradiate lights in a plurality of emitting directions different from one another on the observation target placed on the stage; an imaging section configured to receive light from the observation target and respectively generate a plurality of image data indicating images of the observation target at times when the lights in the plurality of emitting directions are respectively irradiated on the observation target; a designating section configured to designate an imaginary emitting direction of light on the basis of operation by a user; a data generating section configured to generate, on the basis of the emitting direction designated by the designating section and the plurality of image data generated by the imaging section, image data for display indicating an image of the observation target that should be obtained when it is assumed that the light in the designated emitting direction is irradiated on the observation target; and a display section configured to display an image for display based on the image data for display generated by the data generating section.
In the magnifying observation apparatus, the observation target is placed on the stage, and the lights in the plurality of emitting directions are respectively selectively irradiated on the observation target on the stage. The plurality of image data indicating the images of the observation target at the time when the lights in the plurality of emitting directions are respectively irradiated on the observation target are generated by the imaging section.
The imaginary emitting direction of light is designated on the basis of the operation by the user. In this case, the image data for display indicating the image of the observation target that should be obtained when it is assumed that the light in the designated emitting direction is irradiated on the observation target is generated on the basis of the designated emitting direction and the plurality of image data generated by the imaging section. The image based on the generated image data for display is displayed on the display section.
Therefore, by optionally designating the imaginary emitting direction of light, the user can generate, without changing an emitting direction of light actually irradiated on the observation target, image data for display indicating an image at the time when light in an appropriate emitting direction corresponding to the shape and the material of the observation target is irradiated on the observation target. Consequently, the user can easily acquire an image of the observation target corresponding to a request of the user.
The image data for display can be generated using the already generated plurality of image data. Therefore, it is unnecessary to perform the imaging of the observation target again, and a burden on the user can be reduced.
(2) The data generating section may generate the image data for display by combining at least two or more image data among the plurality of image data generated by the imaging section.
In this case, at least two or more image data among a plurality of image data corresponding to the lights in the plurality of emitting directions different from one another are combined. Consequently, it is possible to easily generate image data for display indicating an image of the observation target that should be obtained when it is assumed that light in an emitting direction different from light actually irradiated on the observation target is irradiated on the observation target.
(3) The data generating section may determine rates of the combination of the at least two or more image data on the basis of the designated emitting direction. In this case, it is possible to more easily generate the image data for display on the basis of the plurality of image data.
(4) The data generating section may update, according to the emitting direction designated by the designating section, the image data for display to be generated. In this case, the user can easily acquire, by designating an optimum emitting direction corresponding to the shape and the material of the observation target, image data for display indicating an image of the observation target corresponding to a request of the user.
(5) The display section may display an image for display based on the image data for display before the update simultaneously with an image for display based on the image data for display after the update. In this case, the user can compare the image for display based on the image data for display before the update and the image for display based on the image data for display after the update.
(6) The designating section may include: an indicator display section configured to display a first indicator indicating an emitting position of light; and an operation section operated by the user in order to change a position of the first indicator displayed on the indicator display section. The imaginary emitting direction of light may be designated on the basis of the position of the first indicator on the indicator display section operated by the operation section. In this case, the user can easily designate, by moving the first indicator using the operation section, the imaginary emitting direction of light while grasping the emitting position of light.
(7) The indicator display section may further display a second indicator indicating a position of the observation target. In this case, the user can easily designate, by visually recognizing the first and second indicators, the imaginary emitting direction of light while grasping a relation between the observation target and the emitting position of light.
(8) The indicator display section may further display a third indicator indicating a range that can be designated as the imaginary emitting direction of light. In this case, the user can easily grasp, by visually recognizing the third indicator, the range that can be designated as the imaginary emitting direction of light.
(9) The indicator display section may configure a part of the display section, and the display section may include: a first display region where the image for display based on the image data for display is displayed; and a second display region where the first indicator is displayed. In this case, in the display section, the image for display is displayed in the first display region and the first indicator is displayed in the second display region. Consequently, the image for display and the first indicator do not overlap. Therefore, it is easy to visually recognize the image for display and the first indicator.
(10) The indicator display section may configure a part of the display section, and the display section may superimpose and display the first indicator on the image for display based on the image data for display. In this case, in the display section, the first indicator is superimposed and displayed on the image for display. Consequently, the user can easily designate the imaginary emitting direction of light with more intuitive operation.
(11) The magnifying observation apparatus may further include an objective lens, the imaging section may generate the plurality of image data by receiving light from the observation target via the objective lens, and the light projecting device may include: a first light projecting section including a plurality of light emission regions rotation-symmetrically disposed around an optical axis of the objective lens, the first light projecting section being provided to surround the optical axis of the objective lens; and a switching section configured to switch an emission state of light in the plurality of light emission regions such that the lights in the plurality of emitting directions are irradiated on the observation target.
In this case, the emission state of light in the plurality of light emission regions of the first light projecting section is switched by the switching section. Consequently, it is possible to selectively irradiate, with a simple configuration and simple control, the lights in the plurality of emitting directions different from one another on the observation target.
(12) The light projecting device may further include a second light projecting section configured to irradiate first light on the observation target from a position closer to the optical axis of the objective lens than the first light projecting section, the switching section may further switch an emission state of the first light in the second light projecting section, the imaging section may further generate image data indicating an image of the observation target at a time when the first light is irradiated on the observation target, and the data generating section may generate the image data for display further on the basis of the image data generated according to the first light.
In this case, the first light is irradiated on the observation target from the position closer to the optical axis of the objective lens than the first light projecting section by the second light projecting section. In the image of the observation target at the time when the first light is irradiated, unevenness on the surface of the observation target is more clearly shown compared with the image of the observation target at the time when the light is irradiated from the first light projecting section. Therefore, by using the image data corresponding to the first light, it is possible to generate the image data for display that more clearly shows the unevenness on the surface of the observation target.
(13) The light projecting device may further include a third light projecting section disposed to be opposed to the imaging section across the observation target and configured to irradiate second light on the observation target, the switching section may further switch an emission state of the second light in the third light projecting section, the imaging section may further generate image data indicating an image of the observation target at a time when the second light is irradiated on the observation target, and the data generating section may generate the image data for display further on the basis of the image data generated according to the second light.
In this case, the second light is irradiated on the observation target by the third light projecting section, and light transmitted through the observation target is received by the imaging section. In the image of the observation target at the time when the second light is irradiated, the structure on the inside of the observation target is shown. Therefore, by using the image data corresponding to the second light, it is possible to generate the image data for display that shows the structure on the inside of the observation target.
(14) The designating section may be configured to be capable of designating the imaginary emitting direction of light on the image for display displayed by the display section. Consequently, the user can easily designate the imaginary emitting direction of light.
(15) The display section may include: a main display region where the image for display based on the image data for display is displayed; and a sub-display region where a position image indicating a position of the observation target is displayed, the sub-display region being smaller than the main display region, emitting position indicators indicating emitting positions of lights may be respectively displayed in the main display region and the sub-display region, and, when the designation of the imaginary emitting direction of light is changed by the designating section, the emitting position indicators displayed in the main display region and the sub-display region may move in association with each other to move to positions corresponding to the changed emitting direction.
In this case, the emitting position of light indicated by the emitting position indicator displayed in the main display region and the emitting position of light indicated by the emitting position indicator displayed in the sub-display region are aligned. Therefore, the user can grasp the imaginary emitting direction of light while visually recognizing a desired region of the main display region and the sub-display region.
(16) The designating section may be configured to be capable of operating at least one of the emitting position indicator displayed in the main display region and the emitting position indicator displayed in the sub-display region. In this case, when one emitting position indicator moves according to the operation of the designating section by the user, the other emitting position indicator also moves in association with the movement of the one emitting position indicator.
According to the present invention, it is possible to easily acquire an image of the observation target corresponding to a request of the user.
A magnifying observation apparatus according to a first embodiment of the present invention is explained with reference to the drawings.
The stand section 110 has an L-shape in a longitudinal cross section and includes a setting section 111, a holding section 112, and a focus driving section 113. The setting section 111 and the holding section 112 are formed by, for example, resin. The setting section 111 has a horizontal flat shape and is set on a setting surface. The holding section 112 is provided to extend upward from one end portion of the setting section 111.
The stage device 120 includes a stage 121 and a stage driving section 122. The stage 121 is provided on the upper surface of the setting section 111. An observation target S is placed on the stage 121. Two directions orthogonal to each other in a plane on the stage 121 on which the observation target S is placed (hereinafter referred to as placement surface) are defined as an X direction and a Y direction and respectively indicated by arrows X and Y. A direction of a normal orthogonal to the placement surface of the stage 121 is defined as a Z direction and indicated by an arrow Z. A direction of rotation around an axis parallel to the Z direction is defined as a θ direction and indicated by an arrow θ.
The stage driving section 122 includes a not-shown actuator such as a stepping motor. The stage driving section 122 moves the stage 121 in the X direction, the Y direction, or the Z direction or rotates the stage 121 in the θ direction on the basis of a drive pulse given by the control board 150. The user is also capable of manually moving the stage 121 in the X direction, the Y direction, or the Z direction or rotating the stage 121 in the θ direction.
The lens barrel section 130 includes a lens unit 131 and an imaging section 132 and is disposed above the stage 121. The lens unit 131 can be replaced with another lens unit according to a type of the observation target S. The lens unit 131 is configured by an objective lens 131a and a not-shown plurality of lenses. An optical axis A1 of the objective lens 131a is parallel to the Z direction. The imaging section 132 includes, for example, a CMOS (complementary metal oxide semiconductor) camera. The imaging section 132 may include another camera such as a CCD (charge coupled device) camera.
The lens barrel section 130 is attached to the holding section 112 by the focus driving section 113 of the stand section 110. The focus driving section 113 includes a not-shown actuator such as a stepping motor. The focus driving section 113 moves the lens unit 131 in the direction of the optical axis A1 of the objective lens 131a (the Z direction) on the basis of a drive pulse given by the control board 150. Consequently, a focal position of light passed through the lens unit 131 changes in the Z direction. The user is also capable of manually moving the lens unit 131 in the direction of the optical axis A1 of the objective lens 131a.
The light projecting section 140 is integrally attached to the lens unit 131 to surround the optical axis A1 of the objective lens 131a. Consequently, it is possible to uniquely determine a positional relation between the light projecting section 140 and the lens unit 131. Since it is unnecessary to add a member that holds the light projecting section 140 in the magnifying observation apparatus 1, it is possible to reduce the magnifying observation apparatus 1 in size.
Lights in a plurality of emitting directions are irradiated on the observation target S on the stage 121 from the light projecting section 140. Light reflected to above the stage 121 by the observation target S is condensed and focused by the lens unit 131 and thereafter received by the imaging section 132. The imaging section 132 generates image data on the basis of pixel data corresponding to light reception amounts of pixels. Each of a plurality of image data respectively generated by the imaging section 132 at the time when the lights in the plurality of emitting directions are irradiated on the observation target S by the light projecting section 140 is referred to as original image data. The imaging section 132 gives the generated plurality of original image data to a control device 400.
The control board 150 is provided in, for example, the holding section 112 of the stand section 110 and connected to the focus driving section 113, the stage driving section 122, and the imaging section 132. The control board 150 controls the operations of the focus driving section 113 and the stage driving section 122 on the basis of control by the processing device 200. A control signal is input to the imaging section 132 from the control device 400. A plurality of original image data generated by the imaging section 132 are sequentially given to the processing device 200 via a cable 203.
The processing device 200 includes a housing 210, a light generating section 300, and the control device 400. The housing 210 houses the light generating section 300 and the control device 400. The light generating section 300 is optically connected to the light projecting section 140 of the measurement head 100 by a fiber unit 201. The fiber unit 201 includes a not-shown plurality of optical fibers.
The light generating section 300 includes a light source 310 and a light blocking section 320. The light source 310 is, for example, an LED (light emitting diode). The light source 310 may be another light source such as a halogen lamp. The light blocking section 320 is disposed between the light source 310 and the fiber unit 201 to be capable of partially blocking light emitted by the light source 310. The light emitted by the light source 310 passes through the light blocking section 320 and is made incident on the fiber unit 201. Consequently, light is emitted from the light projecting section 140 of the measurement head 100 through the fiber unit 201.
The control section 410 includes a driving control section 500 and an arithmetic processing section 600. A system program is stored in the storing section 420. The storing section 420 is used for processing of various data and saving of various data given from the control section 410. Functions of the driving control section 500 and the arithmetic processing section 600 are realized by the control section 410 executing the system program stored in the storing section 420.
The driving control section 500 includes a light-projection control section 510, an imaging control section 520, a focus control section 530, and a stage control section 540. The light-projection control section 510 is connected to the light generating section 300 and controls the operation of the light generating section 300.
The imaging control section 520, the focus control section 530, and the stage control section 540 respectively control the operations of the imaging section 132, the focus driving section 113, and the stage driving section 122 through the control board 150. The imaging control section 520 sequentially gives a plurality of original image data generated by the imaging section 132 to the arithmetic processing section 600.
The arithmetic processing section 600 can generate, on the basis of at least one of the acquired plurality of original image data, image data for display indicating an image of the observation target S that should be obtained when it is assumed that light in an emitting direction designated by the user is irradiated on the observation target S. Details of the arithmetic processing section 600 are explained below. The plurality of original image data acquired by the arithmetic processing section 600 and the image data for display generated by the arithmetic processing section 600 are stored in the storing section 420.
The display section 430 is configured by, for example, an LCD (liquid crystal display) panel. The display section 430 may be configured by another display section such as an organic EL (electroluminescence) panel. The display section 430 displays, for example, an image based on the image data stored in the storing section 420 or the image data generated by the arithmetic processing section 600. The operation section 440 includes a pointing device such as a mouse, a touch panel, a trackball, or a joystick and a keyboard and is operated by the user in order to give an instruction and the like to the control device 400. The operation section 440 may include a jog shuttle in addition to the pointing device and the keyboard. The operation section 440 may include dial-like operation means, a rotation center of which faces the horizontal direction, for moving the lens barrel section 130 and the stage 121 in the up-down direction.
The communication section 450 includes an interface for connecting the control device 400 to a network.
A plurality of through holes 141a piercing through the holding member 141 from the upper surface to the lower surface are formed in the holding member 141. The plurality of through holes 141a are disposed at substantially equal intervals and located rotation-symmetrically around the optical axis A1 of the objective lens 131a. The plurality of optical fibers 142 are respectively inserted through the plurality of through holes 141a. Consequently, the plurality of optical fibers 142 are integrally held by the holding member 141. Incident sections and emission sections of lights in the optical fibers 142 are respectively located on the upper surface and the lower surface of the holding member 141. Consequently, a light emitting section 140o is formed on the lower surface of the holding member 141.
The plurality of optical fibers 142 are disposed on one circumference centering on the optical axis A1 of the objective lens 131a. Therefore, the distances from the optical axis A1 of the objective lens 131a to the emitting sections in the plurality of optical fibers 142 are substantially equal. An angle formed by lines, which connect the emitting sections in the optical fibers 142 and the center of the stage 121, with respect to the optical axis A1 of the objective lens 131a is an acute angle. In this embodiment, the holding member 141 integrally holds the plurality of optical fibers 142, whereby a positional relation among the plurality of optical fibers 142 is easily maintained.
The incident sections of the plurality of optical fibers 142 are optically connected to the light generating section 300 of the processing device 200 by the fiber unit 201.
The light emitting section 140o of the light projecting section 140 is divided into a plurality of (in this embodiment, four) regions 140A, 140B, 140C, and 140D. The light blocking section 320 can selectively block the lights made incident on the optical fibers 142 corresponding to the regions 140A to 140D, whereby light is selectively emitted from any one or more of the regions 140A to 140D.
In this way, the light projecting section 140 can irradiate the lights having the emitting directions different from one another on the observation target S. Lights simultaneously emitted from the entire regions 140A to 140D are referred to as ring illumination. Light emitted from any one region of the regions 140A to 140D is referred to as directional illumination. In this embodiment, the light projecting section 140 is capable of selectively emitting the ring illumination and any one of four directional illuminations. Therefore, the imaging section 132 can generate a plurality of original image data respectively corresponding to the ring illumination and the four directional illuminations.
The four directional illuminations are lights respectively emitted from four positions (the regions 140A to 140D) different from one another by approximately 90° in the θ direction around the optical axis A1 of the objective lens 131a. The four directional illuminations are rotation-symmetrical around the optical axis A1 of the objective lens 131a. Therefore, the directional illuminations deviate from the optical axis A1 of the objective lens 131a. The four directional illuminations are emitted in directions inclined with respect to the optical axis A1 of the objective lens 131a and different from one another. Light amounts of the four directional illuminations are substantially equal to one another. Angles of irradiation of the four directional illuminations with respect to the optical axis A1 of the objective lens 131a are not uniform in the θ direction.
On the other hand, the ring illumination is light not deviating from the optical axis A1 of the objective lens 131a. The center of the ring illumination substantially coincides with the optical axis A1 of the objective lens 131a. Therefore, the ring illumination is emitted substantially in the direction of the optical axis A1 of the objective lens 131a. The ring illumination has a substantially uniform light amount distribution around the optical axis A1 of the objective lens 131a. A light amount of the ring illumination is substantially equal to a sum of light amounts of the four directional illuminations. That is, the light amount of the ring illumination is approximately four times as large as the light amount of each of the directional illuminations. An angle of irradiation of the ring illumination with respect to the optical axis A1 of the objective lens 131a is uniform in the θ direction.
As explained above, in this embodiment, the plurality of regions 140A to 140D are disposed rotation-symmetrically around the optical axis A1 of the objective lens 131a. Consequently, when image data for display is generated by an arithmetic operation on the basis of the plurality of original image data, it is possible to simplify the arithmetic operation.
In this embodiment, the optical fibers 142 are provided as the light emitting members in the regions 140A to 140D of the light projecting section 140. However, the present invention is not limited to this. Light sources such as LEDs may be provided as the light emitting members in the regions 140A to 140D of the light projecting section 140. In this case, the light generating section 300 is not provided in the processing device 200. In this configuration, one or a plurality of light sources provided in each of the regions 140A to 140D emit lights, whereby the lights are emitted from the regions 140A to 140D.
Further, in this embodiment, the four regions 140A to 140D from which lights are emitted are provided in the light projecting section 140. However, the present invention is not limited to this. Three or less or five or more regions from which lights are emitted may be provided in the light projecting section 140.
In this embodiment, the plurality of light emitting members (the optical fibers 142) are disposed on one circumference centering on the optical axis A1 of the objective lens 131a. However, the present invention is not limited to this. The plurality of light emitting members may be disposed on two or more concentric circles centering on the optical axis A1 of the objective lens 131a. Further, in this embodiment, the plurality of light emitting members are disposed in each of the regions 140A to 140D. However, the present invention is not limited to this. One light emitting member may be disposed in each of the regions 140A to 140D.
In the embodiment, the light projecting section 140 is configured as a unit such that a positional relation among the plurality of light emission regions does not change. However, the present invention is not limited to this. The light projecting section 140 may be configured to be capable of changing the positional relation among the plurality of light emission regions.
The stage 121 moves in the Z direction on the basis of control by the stage control section 540.
The imaging control section 520 controls a light reception time, a gain, timing, and the like of the imaging section 132. For example, the imaging control section 520 adjusts a light reception time during irradiation of the directional illuminations on the basis of a light reception time during irradiation of the ring illumination. In this example, as explained above, the light amount of the ring illumination is approximately four times as large as the light amount of each of the directional illuminations. Therefore, the imaging control section 520 adjusts the light reception time during the irradiation of the directional illuminations to be four times as long as a light reception time during the irradiation of the ring illumination.
With this control, the imaging section 132 can generate original image data at high speed compared with when the light reception times during the irradiation of the directional illuminations are independently adjusted. The imaging section 132 can easily substantially equalize brightness of an image during the irradiation of the ring illumination and brightness of an image during the irradiation of the directional illuminations. Note that, in this example, control contents of the imaging section 132 during the irradiation of the plurality of directional illuminations are the same as one another.
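For illustration only, the following minimal sketch restates this adjustment; it is not part of the original disclosure, the function and variable names are hypothetical, and the 4:1 light amount ratio written into the example data is taken from the description above.

```python
# Illustrative sketch: scale the light reception (exposure) time of each
# illumination inversely to its relative light amount so that image
# brightness is substantially equalized. Names and values are assumptions.

def reception_times(ring_time_s, light_amount_vs_ring):
    """ring_time_s: reception time used for the ring illumination (seconds).
    light_amount_vs_ring: light amount of each illumination relative to
    the ring illumination (ring itself = 1.0)."""
    return {name: ring_time_s / ratio
            for name, ratio in light_amount_vs_ring.items()}

# Each directional illumination carries about 1/4 of the ring light amount,
# so its reception time becomes about four times the ring reception time.
print(reception_times(0.005, {"ring": 1.0, "directional_1": 0.25,
                              "directional_2": 0.25, "directional_3": 0.25,
                              "directional_4": 0.25}))
# {'ring': 0.005, 'directional_1': 0.02, ...}
```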
The imaging section 132 can generate a plurality of original image data while the light reception time is changed among a plurality of values by the imaging control section 520. The arithmetic processing section 600 can process the plurality of original image data generated with the different light reception times.
An inclination angle of the optical axis A1 of the objective lens 131a with respect to the Z direction (hereinafter referred to as inclination angle of the lens barrel section 130) is detected by the inclination sensor 133. An angle signal corresponding to the inclination angle is output to the control board 150.
With the configuration explained above, it is possible to selectively perform a plane observation and an inclined observation of the observation target S placed on the placement surface of the stage 121. During the plane observation, the optical axis A1 of the objective lens 131a is parallel to the Z axis. That is, the inclination angle of the lens barrel section 130 is 0°. On the other hand, during the inclined observation, the optical axis A1 of the objective lens 131a inclines with respect to the Z direction. The user can perform observation of the observation target S in a state in which the lens barrel section 130 is detached from the stand section 110.
The focus control section 530 controls the focus driving section 113 to change a focal position of light in the Z direction, and the imaging section 132 generates a plurality of original image data indicating the observation target S in different positions in the Z direction.
In this processing, the user can designate a range in which the focus driving section 113 moves in the Z direction. When the moving range is designated, the focus control section 530 controls the focus driving section 113 such that a focal position of light changes in the Z direction in the designated moving range. Consequently, the imaging section 132 can generate, in a short time, the plurality of original image data indicating the observation target S in the different positions in the Z direction.
The arithmetic processing section 600 can determine a focus degree of each of pixels concerning each of the generated plurality of original image data indicating the observation target S in the different positions in the Z direction. The focus control section 530 can adjust the focus driving section 113 on the basis of a determination result of the focus degree by the arithmetic processing section 600 such that the imaging section 132 is focused on a specific portion of the observation target S (autofocus processing). Further, the arithmetic processing section 600 can generate image data focused on all portions of the observation target S by selectively combining the plurality of original image data for each of the pixels on the basis of the determination result of the focus degree (depth synthesis processing).
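The text does not specify how the focus degree of each pixel is determined. The following minimal sketch of the depth synthesis idea assumes a locally averaged squared Laplacian as the focus degree and a stack of same-sized grayscale frames captured at different Z positions; all names are illustrative, not taken from the apparatus.

```python
# Illustrative focus-stacking sketch (assumed focus metric, assumed names).
import numpy as np
from scipy.ndimage import laplace, uniform_filter

def depth_synthesis(stack: np.ndarray) -> np.ndarray:
    """stack: (n_z, h, w) array of frames at different Z positions.
    Returns an (h, w) image focused on all portions."""
    stack = stack.astype(float)
    # Focus degree per pixel: locally averaged squared Laplacian response.
    focus = np.stack([uniform_filter(laplace(f) ** 2, size=9) for f in stack])
    best = np.argmax(focus, axis=0)        # best-focused slice for each pixel
    rows, cols = np.indices(best.shape)
    return stack[best, rows, cols]         # pick each pixel from that slice
```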
When a positional relation in the Z direction among the observation target S, the lens unit 131, and the light projecting section 140 changes, an angle of elevation of a light source that irradiates illumination on the observation target S changes.
The position sensor 123 includes, for example, a linear encoder or a rotary encoder and is attached to the stage 121. The position of the stage 121 is detected by the position sensor 123. A position signal indicating the position is output to the control board 150.
As explained above, the position sensor 123 is attached to the stage 121. However, the present invention is not limited to this. The position sensor 123 does not have to be attached to the stage 121. In this case, a scale indicating the position of the stage 121 may be added to the stage 121. Alternatively, the arithmetic processing section 600 may calculate the position of the stage 121 on the basis of the number of drive pulses given from the control board 150 to the stage driving section 122.
The arithmetic processing section 600 can generate image data indicating a region of the observation target S larger than a visual field (a unit region explained below) of the imaging section 132 by connecting a plurality of image data generated while the stage 121 is moved in the X direction or the Y direction (connection processing).
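A minimal sketch of this connection idea follows. It assumes an ideal case (known pixel pitch, tiles aligned exactly with non-negative stage coordinates, no overlap blending), and every name in it is illustrative rather than taken from the apparatus.

```python
# Illustrative tiling sketch: paste unit-region images into one mosaic
# using the stage position recorded for each tile.
import numpy as np

def connect_tiles(tiles, stage_xy_um, um_per_px, tile_shape):
    """tiles: list of (h, w) arrays; stage_xy_um: (x, y) stage position of
    each tile in micrometers, measured from the mosaic origin."""
    px = [(int(round(x / um_per_px)), int(round(y / um_per_px)))
          for x, y in stage_xy_um]
    h, w = tile_shape
    width = max(x for x, _ in px) + w
    height = max(y for _, y in px) + h
    mosaic = np.zeros((height, width), dtype=tiles[0].dtype)
    for tile, (x, y) in zip(tiles, px):
        mosaic[y:y + h, x:x + w] = tile    # paste tile at its stage offset
    return mosaic
```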
The operation section 440 includes a general-purpose pointing device and a general-purpose keyboard and also includes a dedicated console 440A of the control device 400. The console 440A includes a joystick 441, a dial 442, and a plurality of buttons 443.
The user can give various instructions to the control device 400 by operating the joystick 441, the dial 442, or the plurality of buttons 443. The joystick 441 is used to give an instruction for moving the stage 121.
A part of the plurality of buttons 443 is used to switch ON and OFF the emission of light from the light projecting section 140.
The plurality of buttons 443 are also used to give instructions concerning display on the display section 430.
The data generating section 610 generates image data for display on the basis of at least one of a plurality of original image data generated by the imaging section 132.
The calculating section 630 includes an angle calculating section 631 and a position calculating section 632. The angle calculating section 631 calculates an inclination angle of the lens barrel section 130 on the basis of the angle signal output from the inclination sensor 133.
The position calculating section 632 calculates a position of the stage 121 on the basis of the position signal output from the position sensor 123.
The condition setting section 640 includes an imaging-condition setting section 641 and an illumination-condition setting section 642. The imaging-condition setting section 641 sets imaging conditions according to an instruction of the user. The condition setting section 640 causes the storing section 420 to store imaging information corresponding to the set imaging conditions.
The illumination-condition setting section 642 sets illumination conditions according to an instruction of the user. The illumination-condition setting section 642 causes the storing section 420 to store illumination information corresponding to the set illumination conditions. The illumination conditions include an imaginary emitting direction of light with respect to the observation target S. The data generating section 610 generates image data for display on the basis of the illumination conditions set by the illumination-condition setting section 642 and causes the storing section 420 to store the image data for display. An instruction method for illumination conditions by the user is explained below.
A position where the optical axis A1 of the objective lens 131a crosses the placement surface of the stage 121 is referred to as reference point. The observation target S is placed on the stage 121 such that an observation target portion is located on the reference point. In this state, the position in the Z direction of the lens unit 131 is adjusted such that the imaging section 132 is focused on the observation target portion.
In the following explanation, in order to distinguish the four directional illuminations, lights emitted from the respective regions 140A, 140B, 140C, and 140D of the light projecting section 140 are respectively referred to as first directional illumination, second directional illumination, third directional illumination, and fourth directional illumination. In the following explanation, a traveling direction of a ray obtained when a plurality of rays forming the ring illumination are combined in terms of a vector is referred to as ring emitting direction. The ring emitting direction is a direction perpendicular to the placement surface of the stage 121. A traveling direction of a ray obtained when a plurality of rays forming the first directional illumination are combined in terms of a vector is referred to as first emitting direction. A traveling direction of a ray obtained when a plurality of rays forming the second directional illumination are combined in terms of a vector is referred to as second emitting direction. Further, a traveling direction of a ray obtained when a plurality of rays forming the third directional illumination are combined in terms of a vector is referred to as third emitting direction. A traveling direction of a ray obtained when a plurality of rays forming the fourth directional illumination are combined in terms of a vector is referred to as fourth emitting direction.
A polar coordinate system having the reference point as the origin is defined on the placement surface of the stage 121 such that it is possible to specify an emitting direction or an emitting position of light at the time when the light is irradiated on the observation target S placed on the reference point.
A point Q is assumed in any position on the placement surface or above the placement surface. In this case, the position of the point Q viewed from the reference point can be specified by an azimuth angle around the optical axis A1 and an angle of elevation from the placement surface.
In this example, center portions of the regions 140A, 140B, 140C, and 140D of the light projecting section 140 are respectively disposed at azimuth angles of 45°, 135°, 225°, and 315° around the optical axis A1. Note that the disposition of the light projecting section 140 is not limited to the example explained above. For example, the center portions of the regions 140A, 140B, 140C, and 140D may be respectively disposed at azimuth angles of 0°, 90°, 180°, and 270° around the optical axis A1. Sequentially imaging the observation target S using the ring illumination and the first to fourth directional illuminations is referred to as plural illumination imaging.
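The exact mapping from an operated icon position to an emitting direction is not given in the text. As a rough sketch consistent with the later description (rotation around the image center changes the azimuth angle; radial distance from the center changes the angle of elevation), the conversion could look like the following, where all names and the assumed minimum elevation are illustrative.

```python
# Illustrative sketch: convert an icon position on the target position
# image to an imaginary azimuth angle and angle of elevation.
import math

def icon_to_direction(x, y, cx, cy, r_max, min_elev_deg=30.0):
    """(x, y): icon position; (cx, cy): image center; r_max: radius of the
    designatable circle. Center maps to 90 deg elevation, the outer edge
    to an assumed minimum elevation."""
    dx, dy = x - cx, y - cy
    azimuth = math.degrees(math.atan2(dy, dx)) % 360.0
    r = min(math.hypot(dx, dy), r_max)
    elevation = 90.0 - (90.0 - min_elev_deg) * (r / r_max)
    return azimuth, elevation

print(icon_to_direction(250, 150, 150, 150, 100))  # icon at the circle edge
```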
The components of the magnifying observation apparatus 1 perform the following basic operations in response to an instruction for the plural illumination imaging.
First, the ring illumination is irradiated on the observation target S from the light projecting section 140, and the imaging section 132 generates original image data corresponding to the ring illumination.
Subsequently, the first directional illumination is irradiated on the observation target S from the region 140A of the light projecting section 140, and the imaging section 132 generates original image data corresponding to the first directional illumination.
Subsequently, the second, third, and fourth directional illuminations are sequentially irradiated on the observation target S from the regions 140B, 140C, and 140D, respectively, and the imaging section 132 generates original image data corresponding to the second, third, and fourth directional illuminations.
When the plurality of original image data are respectively stored in the storing section 420, a part of the pixel data is thinned out from each of the original image data, whereby thumbnail image data corresponding to the original image data are generated. The generated thumbnail image data are stored in the storing section 420.
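As a minimal sketch of this thumbnail generation (the decimation step and all names are assumptions, not part of the original disclosure), thumbnail image data can be obtained by keeping every n-th pixel of the original image data:

```python
# Illustrative sketch: thumbnail data produced by thinning out pixel data.
import numpy as np

def make_thumbnail(original: np.ndarray, step: int = 8) -> np.ndarray:
    """Keep every step-th pixel in each direction (assumed step)."""
    return original[::step, ::step]

original = np.zeros((1200, 1600), dtype=np.uint16)  # one original image
print(make_thumbnail(original).shape)               # (150, 200)
```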
The series of operations explained above is automatically performed by the control section 410.
When the plural illumination imaging is completed, an observation screen 430A is displayed on the display section 430. The observation screen 430A includes a function display region 431, a main display region 432, and a sub-display region 433.
The user can operate the buttons displayed in the function display region 431 using the console 440A.
In the main display region 432, an image SI of the observation target S based on image data for display is displayed.
In the sub-display region 433, an emitting-direction designation field 433a and an emitting-direction display field 433b are displayed. In the emitting-direction designation field 433a, a target position image ss0 indicating the position of the observation target S on the placement surface is displayed. An image of the observation target portion in the target position image ss0 is referred to as target portion image sp. A light icon ss1 indicating an emitting position of light with respect to the observation target S at the time when the observation target S is viewed from a position above the light projecting section 140 is superimposed and displayed on the target position image ss0.
In this case, a relative positional relation between the target portion image sp of the observation target S and the light icon ss1 on the target position image ss0 corresponds to an emitting direction of light that should be irradiated on the observation target S in order to obtain the image SI displayed in the main display region 432 (hereinafter referred to as imaginary emitting direction of light). As the target position image ss0, for example, a thumbnail image corresponding to the ring illumination can be used.
By operating the console 440A, the user can move the light icon ss1 to an arbitrary position on the target position image ss0.
The imaginary emitting direction of light is designated by the user, whereby the image SI of the observation target S displayed in the main display region 432 is updated to the image SI of the observation target S that should be obtained when it is assumed that light in a designated emitting direction is irradiated on the observation target S. The update processing of the image SI is executed by the data generating section 610 shown in
In the emitting-direction display field 433b, an image indicating the reference point on the placement surface is displayed as a reference point image ss2 and an image of an imaginary hemisphere covering the reference point on the stage 121 is stereoscopically displayed as a hemispherical image ss3. On the hemispherical image ss3, an image indicating an emitting direction of light corresponding to an imaginary emitting direction of light designated by the light icon ss1 is displayed as an emitting position image ss4.
Further, a straight line is displayed to connect the emitting position image ss4 and the reference point image ss2 on the hemispherical image ss3. In this case, a direction from the emitting position image ss4 to the reference point image ss2 on the straight line indicates the emitting direction of light designated by the light icon ss1. The user can easily and accurately recognize the imaginary emitting direction of light designated by the light icon ss1 by visually recognizing the reference point image ss2, the hemispherical image ss3, and the emitting position image ss4 displayed in the emitting-direction display field 433b.
The magnifying observation apparatus 1 may be configured to be capable of selecting only a part of the ring illumination and the first to fourth directional illuminations as illumination used for the plural illumination imaging. When the plural illumination imaging is performed using only a part of the ring illumination and the first to fourth directional illuminations, a range of an emitting direction that can be designated by the light icon ss1 is sometimes limited.
In this case, in the hemispherical image ss3, the range of the emitting direction that can be designated by the light icon ss1 in the emitting-direction designation field 433a may be displayed to be distinguishable from a range of an emitting direction that cannot be designated. For example, when the range of the emitting direction that can be designated is limited to a range of a specific azimuth angle, a display form such as a color may be differentiated between a portion corresponding to a range of an azimuth angle that can be designated and a portion corresponding to a range of an azimuth angle that cannot be designated. Consequently, the user can easily recognize the range of the emitting direction that can be designated by the light icon ss1. Alternatively, in the hemispherical image ss3, instead of the example explained above, only the range of the emitting direction that can be designated by the light icon ss1 may be displayed.
In this example, in the emitting-direction display field 433b, the imaginary hemisphere covering the reference point is stereoscopically displayed as the hemispherical image ss3. However, the present invention is not limited to this. In the emitting-direction display field 433b, a plane hemispherical image obtained by viewing, from above, the imaginary hemisphere covering the reference point and a side hemispherical image obtained by viewing the imaginary hemisphere from one side may be displayed. In this case, the reference point image ss2 and the emitting position image ss4 may be displayed on the plane hemispherical image. The reference point image ss2 and the emitting position image ss4 may be displayed on the side hemispherical image.
In the magnifying observation apparatus 1, a plane coordinate system decided in advance is defined concerning the target position image ss0 displayed in the sub-display region 433. On the plane coordinate system, points PA, PB, PC, PD, and PE respectively corresponding to the ring illumination and the first to fourth directional illuminations are set.
Positions of the points PA to PE are set, for example, on the basis of a relative positional relation between the light projecting section 140 and the stage 121. In this example, the point PA is located in the center of the target position image ss0. The points PB, PC, PD, and PE are arranged at equal angle intervals on a concentric circle centering on the point PA.
The control section 410 detects a position (a coordinate) on the target position image ss0 of the light icon ss1 at a cycle decided in advance and causes the display section 430 to display, in the main display region 432, the image SI corresponding to the detected position.
For example, when the light icon ss1 is located on any one of the points PA to PE, the control section 410 causes the display section 430 to display, in the main display region 432, the image SI based on the original image data corresponding to that point.
When the light icon ss1 is present in a position different from the points PA to PE, the control section 410 generates, according to a procedure explained below, the image SI of the observation target S that the control section 410 should cause the main display region 432 to display.
First, the control section 410 extracts the three points closest to the light icon ss1 among the points PA to PE (here, the points PA, PB, and PC) and calculates distances d1, d2, and d3 from the position of the light icon ss1 to the extracted points.
Subsequently, the control section 410 determines, on the basis of the distances d1, d2, and d3, combination rates of original image data corresponding to the point PA, original image data corresponding to the point PB, and original image data corresponding to the point PC.
The combination rates are, for example, ratios of the inverses of the distances d1, d2, and d3. In this case, the combination rate is higher for the original image data corresponding to a point closer to the light icon ss1 and lower for the original image data corresponding to a point farther from the light icon ss1.
The control section 410 combines the three original image data respectively corresponding to the points PA, PB, and PC on the basis of the determined combination rates. Specifically, the control section 410 multiplies, for each of the original image data, the values (pixel values) of all pixel data of the original image data by the combination rate and combines the three original image data after the multiplication to thereby generate image data for display. Thereafter, the control section 410 causes the display section 430 to display, in the main display region 432, the image SI based on the generated image data for display.
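The following minimal sketch restates this combination numerically. The array-based representation of the original image data and the epsilon guard for an icon placed exactly on a point are additions for illustration, not part of the original disclosure.

```python
# Illustrative sketch: combination rates proportional to the inverses of
# the distances d1, d2, d3, followed by a per-pixel weighted sum.
import numpy as np

def display_image(originals, distances, eps=1e-9):
    """originals: list of (h, w[, c]) arrays for the three nearest points;
    distances: [d1, d2, d3] from the light icon to those points."""
    inv = np.array([1.0 / max(d, eps) for d in distances])
    rates = inv / inv.sum()                 # combination rates, sum to 1
    out = np.zeros_like(originals[0], dtype=np.float64)
    for img, rate in zip(originals, rates):
        out += rate * img                   # multiply pixel values by rate
    return out
```

Passing five images and five distances instead of three would yield the variant, described below, in which all the points PA to PE contribute.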
As explained above, after the completion of the plural illumination imaging, the image data of the image SI displayed in the main display region 432 is generated on the basis of the generated plurality of original image data and the imaginary emitting direction of light designated by the user. Therefore, even if the imaginary emitting direction of light designated by the user continuously changes, a plurality of image data corresponding to the designated emitting direction are substantially continuously generated at a speed corresponding to the processing ability of the control section 410. The image SI based on the generated plurality of image data is continuously displayed. Therefore, a video substantially the same as a video (a moving image) obtained by continuously performing imaging while changing the position of illumination is reproduced in a simulated manner in the main display region 432. Consequently, by visually recognizing the image SI in the main display region 432 while designating the imaginary emitting direction of light, the user feels as if the light in the designated emitting direction is irradiated on the observation target S in real time.
In the example explained above, the control section 410 extracts the three points among the points PA to PE corresponding to the ring illumination and the first to fourth directional illuminations and determines the combination rates of the three original image data corresponding to the extracted three points. However, the present invention is not limited to this. The control section 410 may determine combination rates concerning five original image data respectively corresponding to all the points PA to PE and combine the five original image data on the basis of the determined combination rates.
The light icon ss1 may also be superimposed and displayed on the image SI of the observation target S displayed in the main display region 432.
In this case, the points PA to PE are set on the image SI of the observation target S as well. The user can move, on the observation screen 430A, using, for example, the console 440A, the light icon ss1 on the target position image ss0 or the light icon ss1 on the image SI of the observation target S.
When the light icon ss1 on the target position image ss0 is moved, the control section 410 generates image data for display in a procedure same as the procedure in the example explained with reference to
When the light icon ss1 on the image SI of the observation target S is moved, the control section 410 generates image data for display on the basis of a positional relation between the position of the light icon ss1 on the image SI and the points PA to PE set in the image SI and displays the image SI based on the generated image data for display in the main display region 432. At this point, the control section 410 adjusts the position of the light icon ss1 on the target position image ss0 to move the light icon ss1 to a position corresponding to the position of the light icon ss1 on the image SI.
According to this example, the user can designate the imaginary emitting direction of light with more intuitive operation while visually recognizing the image SI of the observation target S.
Note that the light icon ss1 on the target position image ss0 may be displayed irrespective of the position of the mouse pointer. Consequently, the user can easily grasp an imaginary emitting position of light.
The user can move the light icon ss1 by moving the mouse pointer in the main display region 432 and designate the imaginary emitting position of light. Consequently, the user can continuously change the designation of the imaginary emitting position of light by moving the mouse pointer in the main display region 432. Note that, when the designation of the imaginary emitting position of light is enabled in the main display region 432, the designation of the imaginary emitting position of light may be unable to be changed even if the mouse pointer is moved on the inside of the emitting-direction designation field 433a.
The light icon ss1 may be disposed in the emitting-direction display field 433b. In this case, the light icon ss1 displayed in the emitting-direction display field 433b is an icon for stereoscopically showing the imaginary emitting position of light to the user. The light icon ss1 displayed in the emitting-direction display field 433b may be able to be operated by the mouse pointer or may be unable to be operated by the mouse pointer as the icon for stereoscopically showing the imaginary emitting position of light. A numerical-value display section that displays the imaginary emitting position of light in terms of a numerical value using an angle of elevation and an azimuth angle may be further provided.
Further, as explained above, a plurality of light icons ss1 simultaneously displayed in a plurality of regions on the observation screen 430A may move in association with one another. When the imaginary emitting position of light is changed by operating the light icon ss1 displayed in the main display region 432 with the mouse pointer, the light icon ss1 displayed in the emitting-direction designation field 433a and the emitting-direction display field 433b may move in association with the movements of the mouse pointer and the light icon ss1 displayed in the main display region 432. In this way, in the main display region 432 in which a two-dimensional observation image is displayed, it is also possible to designate the angle of elevation and the azimuth angle by designating the imaginary emitting position of light. When the light icon ss1 is operated in the main display region 432, the display of the imaginary emitting position of light in the emitting-direction display field 433b moves in association with the operation, and the observation target S is displayed as if light were irradiated from the position having the designated angle of elevation.
In the emitting-direction display field 433b, a hemispherical schematic diagram indicating the imaginary emitting position of light, the angle of elevation, and the azimuth angle is displayed to make these easy to understand.
A specific example is explained concerning a change in the observation screen 430A at the time when an imaginary emitting direction of light is designated by the light icon ss1 in a state in which the observation screen 430A is displayed.
As indicated by a dotted line in the emitting-direction designation field 433a, it is assumed that the user moves the light icon ss1 on the target position image ss0.
In the image SI displayed in the main display region 432, an uneven portion in the observation target S is emphasized according to the designated emitting direction.
The imaginary emitting direction of light includes components of an azimuth angle and an angle of elevation. In this example, the light icon ss1 is moved on the concentric circle, centering on the point PA, on which the points PB to PE are arranged. The user can designate an azimuth angle of the imaginary emitting direction of light by rotating the light icon ss1 on the target position image ss0 with respect to the center of the target position image ss0. Consequently, the azimuth angle of the imaginary emitting direction of light is changed to the azimuth angle designated by the light icon ss1. In this way, the user can designate the azimuth angle of the imaginary emitting direction of light in a desired direction by operating the light icon ss1. As a result, in the image SI displayed in the main display region 432, the user can easily change, in the θ direction, the direction in which the uneven portion in the observation target S is emphasized.
In the image SI displayed in the main display region 432 shown in
In this example, the light icon ss1 is moved from the concentric circle on which the points PB to PE are arranged, centered on the point PA, toward the point PA. The user can designate an angle of elevation of the imaginary emitting direction of light from the placement surface by moving the light icon ss1, on the target position image ss0, closer to or farther from the center of the target position image ss0. Consequently, the angle of elevation of the imaginary emitting direction of light from the placement surface is changed to the angle of elevation designated by the light icon ss1. In this way, the user can set the angle of elevation of the imaginary emitting direction of light to a desired angle by operating the light icon ss1. As a result, in the image SI displayed in the main display region 432, the user can easily change the degree of emphasis of the uneven portion in the observation target S.
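As an illustrative sketch only (the embodiment does not prescribe an implementation), the translation from the position of the light icon ss1 on the target position image ss0 into an azimuth angle and an angle of elevation could take the following form in Python. The function name icon_to_direction and the linear mapping from radius to angle of elevation are assumptions introduced here:

    import math

    def icon_to_direction(icon_x, icon_y, center_x, center_y, outer_radius):
        # Azimuth angle: rotation of the light icon ss1 about the center
        # of the target position image ss0.
        dx = icon_x - center_x
        dy = icon_y - center_y
        azimuth = math.degrees(math.atan2(dy, dx)) % 360.0
        # Angle of elevation: assumed here to decrease linearly from 90
        # degrees (icon at the center, point PA) to 0 degrees (icon on
        # the outermost circle); the actual mapping may differ.
        r = min(math.hypot(dx, dy) / outer_radius, 1.0)
        elevation = 90.0 * (1.0 - r)
        return azimuth, elevation

    # Example: an icon halfway between the center and the outer circle.
    print(icon_to_direction(150.0, 100.0, 100.0, 100.0, 100.0))  # (0.0, 45.0)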
In the example explained above, on the observation screen 430A of the display section 430, the main display region 432 for displaying the image SI based on the image data for display and the sub-display region 433 for operating the light icon ss1 are set. Consequently, the image SI of the observation target S and the light icon ss1 do not overlap. Therefore, it is easy to visually recognize the image SI of the observation target S and the light icon ss1.
The user operates the saving button b7 shown in
A configuration for designating the imaginary emitting direction of light is not limited to the example explained above.
In the example shown in
Concerning the example shown in
In the example shown in
The light icon ss5 indicates a position (a coordinate) in the left-right direction on the image SI. The light icon ss6 indicates a position (a coordinate) in the up-down direction on the image SI. In this example, as in the example shown in
Concerning the example shown in
In the example shown in
The switch icons ss7 are displayed to be capable of being switched to an ON state and an OFF state. In this case, the control section 410 may generate image data for display on the basis of one or a plurality of images SI corresponding to the switch icon(s) ss7 in the ON state. For example, it is assumed that the switch icon ss7 located in the center of the image SI and the switch icon ss7 located on the upper right of the image SI are switched to the ON state and all the remaining switch icons ss7 are switched to the OFF state. In this case, the control section 410 may generate image data for display by combining the two original image data corresponding to the points PA and PB, as sketched below.
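A minimal sketch of the combination named in this example, assuming for illustration that the original images are held as numpy arrays and that the combination is a plain average (the apparatus may weight the images differently):

    import numpy as np

    def combine_selected(original_images, switch_states):
        # Keep only the images whose switch icons ss7 are in the ON state.
        selected = [img.astype(np.float64)
                    for img, on in zip(original_images, switch_states) if on]
        if not selected:
            raise ValueError("at least one switch icon ss7 must be ON")
        # Average the selected original image data to form the display data.
        return (sum(selected) / len(selected)).astype(np.uint8)

    # Example: only the icons corresponding to the points PA and PB are ON.
    images = [np.full((4, 4), v, dtype=np.uint8) for v in (100, 200, 50, 60, 70)]
    print(combine_selected(images, [True, True, False, False, False]))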
When the plurality of switch icons ss7 are disposed in the same manner as emitting positions of lights corresponding to the ring emitting direction and the first to fourth emitting directions, the user can easily grasp a light emitting position in the light projecting section 140 on the image SI.
Concerning the example shown in
Besides the examples shown in
In the example explained above, one of the plurality of images SI based on the plurality of original image data generated by the immediately preceding plural illumination imaging is displayed in the main display region 432 of the observation screen 430A in the initial state after the completion of the plural illumination imaging. However, the present invention is not limited to this. In the magnifying observation apparatus 1, for example, the imaginary emitting direction of light corresponding to the image SI that should be displayed in the initial state may be designated in advance, before the plural illumination imaging is started. Alternatively, this imaginary emitting direction of light may be designated in advance by the manufacturer at factory shipment of the magnifying observation apparatus 1. Further, until operation by the user is received, an image (a moving image) in which the imaginary emitting direction of light with respect to the image SI smoothly changes may be displayed in the main display region 432 of the observation screen 430A in the initial state after the completion of the plural illumination imaging.
In this case, the control section 410 generates, when the plural illumination imaging is completed, image data for display corresponding to an emitting direction designated on the basis of the imaginary emitting direction of light designated in advance and the plurality of original image data generated by the plural illumination imaging. The control section 410 causes the display section 430 to display, in the main display region 432, the image SI based on the generated image data for display.
The system program stored in the storing section 420 shown in
Subsequently, the control section 410 sets i to 1 (step S102). In this step, i indicates the number of a directional illumination among the plurality of directional illuminations. Subsequently, the control section 410 irradiates the i-th directional illumination on the observation target S and images the observation target S with the imaging section 132 (step S103). Original image data obtained by the imaging is stored in the storing section 420.
Subsequently, the control section 410 determines whether i is 4 (step S104). If i is not 4, the control section 410 updates i to i+1 (step S105) and returns to the processing in step S103.
If i is 4 in step S104, the control section 410 generates a plurality of thumbnail image data respectively corresponding to a plurality of original image data (step S106). The generated plurality of thumbnail image data are stored in the storing section 420. Consequently, the plural illumination imaging processing ends.
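The flow of steps S101 to S106 can be pictured as the following loop. This is a sketch only: the objects illumination, camera, and storage stand in for the light projecting section 140, the imaging section 132, and the storing section 420, and their interfaces are assumptions introduced here, not the actual control API of the apparatus:

    import numpy as np

    class _Stub:
        # Hypothetical stand-in for the illumination and imaging hardware.
        def select(self, name):
            self.current = name
        def capture(self):
            return np.zeros((480, 640), dtype=np.uint8)

    def make_thumbnail(image):
        # Coarse decimation suffices for a thumbnail sketch (step S106).
        return image[::8, ::8]

    def plural_illumination_imaging(illumination, camera, storage):
        # Step S101: image the observation target under the ring illumination.
        illumination.select("ring")
        storage["ring"] = camera.capture()
        # Steps S102 to S105: i = 1 to 4, one directional illumination each.
        for i in range(1, 5):
            illumination.select(f"directional_{i}")
            storage[f"directional_{i}"] = camera.capture()
        # Step S106: thumbnails (may be omitted to shorten processing time).
        for key in list(storage):
            storage[key + "_thumbnail"] = make_thumbnail(storage[key])

    hardware = _Stub()
    store = {}
    plural_illumination_imaging(hardware, hardware, store)
    print(sorted(store))  # ring, directional_1..4, and their thumbnails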
In the above explanation, the processing in step S106 may be omitted when it is unnecessary to display a thumbnail image on the display section 430. Consequently, the processing time is reduced.
In the above explanation, a part of the processing may be performed at other points in time. For example, the processing in step S101 may be executed later than the processing in steps S102 to S105.
First, the control section 410 causes the display section 430 to display, in the main display region 432, the image SI of the observation target S based on any original image data among the plurality of original image data generated by the plural illumination imaging processing (step S201). The control section 410 causes the display section 430 to display the target position image ss0 and the light icon ss1 in the emitting-direction designation field 433a (step S202). Further, the control section 410 causes the display section 430 to display the reference point image ss2, the hemispherical image ss3, and the emitting position image ss4 in the emitting-direction display field 433b (step S203).
Thereafter, the control section 410 determines whether the light icon ss1 is operated (step S204). If the light icon ss1 is not operated, the control section 410 proceeds to processing in step S210 explained below.
If the light icon ss1 is operated, the control section 410 updates the display of the light icon ss1 and the emitting position image ss4 in response to the operation of the light icon ss1 (step S205). The control section 410 recognizes that an imaginary emitting direction of light is designated by the operation of the light icon ss1 (step S206) and determines whether the designated emitting direction is the ring emitting direction or any one of the first to fourth emitting directions (step S207). The determination processing in step S207 is executed on the basis of a positional relation between the points PA to PE set on the target position image ss0 and the light icon ss1.
If the designated emitting direction is the ring emitting direction or any one of the first to fourth emitting directions, the control section 410 sets the original image data corresponding to the designated emitting direction as image data for display and causes the display section 430 to display the image SI of the observation target S based on the image data for display in the main display region 432 (step S208). Thereafter, the control section 410 proceeds to processing in step S210 explained below.
If the designated emitting direction is neither the ring emitting direction nor any of the first to fourth emitting directions in step S207, the control section 410 calculates combination rates of the plurality of original image data on the basis of the designated emitting direction (step S209). Like the processing in step S207, the calculation processing in step S209 is executed on the basis of the positional relation between the points PA to PE set on the target position image ss0 and the light icon ss1.
Thereafter, the control section 410 generates image data for display by combining the plurality of original image data on the basis of the combination rates calculated in step S209 and causes the display section 430 to display the image SI of the observation target S based on the image data for display in the main display region 432 (step S210).
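How steps S209 and S210 could be realized is sketched below. The embodiment specifies only that the combination rates follow from the positional relation between the points PA to PE and the light icon ss1; the inverse-distance weighting used here is one plausible rule, assumed for illustration:

    import numpy as np

    def combination_rates(icon_pos, anchor_points):
        # Step S209: rates from the distances between the light icon ss1
        # and the points PA to PE (inverse-distance weighting assumed).
        icon = np.asarray(icon_pos, dtype=np.float64)
        distances = [np.linalg.norm(icon - np.asarray(p, dtype=np.float64))
                     for p in anchor_points]
        if min(distances) < 1e-9:
            # Icon exactly on one of PA to PE: use that original image
            # alone (corresponding to step S208).
            return [1.0 if d < 1e-9 else 0.0 for d in distances]
        weights = [1.0 / d for d in distances]
        total = sum(weights)
        return [w / total for w in weights]

    def display_image(original_images, rates):
        # Step S210: weighted combination of the plurality of original
        # image data into image data for display.
        combined = sum(r * img.astype(np.float64)
                       for r, img in zip(rates, original_images))
        return np.clip(combined, 0, 255).astype(np.uint8)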
In this embodiment, the user can instruct an end of the observation of the observation target S by operating the console 440A shown in
In the image-for-display generation processing shown in
In the image-for-display generation processing explained above, the image SI of the observation target S based on any original image data among the plurality of original image data generated in the plural illumination imaging processing is displayed on the display section 430 in the processing in step S201. However, the present invention is not limited to this. In the processing in step S201, the control section 410 may cause the display section 430 to display the image SI based on original image data corresponding to illumination decided in advance (e.g., the ring illumination) instead of arbitrary original image data. Alternatively, the control section 410 may omit the processing in step S201.
In the example explained above, the image-for-display generation processing is executed after the end of the plural illumination imaging processing. However, the present invention is not limited to this. When the plural illumination imaging processing is continuously or intermittently executed at a fixed cycle, the image-for-display generation processing may be executed in parallel to the plural illumination imaging processing. In this case, the image-for-display generation processing can be executed on the basis of a latest plurality of original image data stored in the storing section 420 by the immediately preceding plural illumination imaging processing.
In the magnifying observation apparatus 1 according to this embodiment, the user can designate a part of the plurality of original image data stored in the storing section 420 using the console 440A shown in
The user can give an instruction for the depth synthesis processing to the control section 410 shown in
In the depth synthesis processing, it is desirable that a range of the focal position of light and a moving pitch of the focal position in the Z direction be set in advance as imaging conditions for the depth synthesis processing. In this case, the focus driving section 113 shown in
In the depth synthesis processing, the observation target S is imaged using the ring illumination and the first to fourth directional illuminations in a state in which the lens unit 131 is positioned in each of the positions H1 to Hj. Consequently, pluralities of (j) original image data respectively corresponding to the positions H1 to Hj are generated using the ring illumination and the first to fourth directional illuminations. In
A focus degree of each of pixels is determined concerning each of the plurality of original image data obtained by the imaging using the ring illumination. The plurality of original image data are selectively combined on the basis of a determination result of the focus degree. Consequently, depth synthesis image data focused on all portions of the observation target S on which the ring illumination is irradiated is generated. Depth synthesis image data corresponding to the directional illuminations are generated on the basis of the pluralities of original image data corresponding to the directional illuminations and mask image data explained below. An image based on the depth synthesis image data is referred to as depth synthesis image. In
The control section 410 executes the image-for-display generation processing on the basis of a plurality of depth synthesis image data generated by the depth synthesis processing instead of the plurality of original image data. Consequently, the user can easily cause the display section 430 to display a depth synthesis image SF of the observation target S that should be obtained when it is assumed that light is irradiated on the observation target S from a desired direction.
In the depth synthesis processing explained above, pluralities of original image data respectively corresponding to the positions H1 to Hj are generated for each of the illuminations. An operation button for designating only the generation of the pluralities of original image data may be displayed on the observation screen 430A shown in
In the depth synthesis processing, mask image data is generated when depth synthesis image data corresponding to the ring illumination is generated. A plurality of depth synthesis image data respectively corresponding to the first to fourth directional illuminations are generated using the generated mask image data. The mask image data is explained.
Numbers corresponding to the focal positions H1 to Hj of light in the Z direction are given to respective pluralities of original image data generated in processing for imaging the observation target S while changing the positions in the Z direction of the lens barrel section 130 and the stage 121. The data generating section 610 shown in
The data generating section 610 generates, on the basis of the generated mask image data, depth synthesis image data corresponding to the respective first to fourth directional illuminations. In this case, the focus determining section 620 shown in
In the above explanation, the depth synthesis image data is generated first for each of the ring illumination and the first to fourth directional illuminations. The image data for display is generated on the basis of the generated plurality of depth synthesis image data. However, the present invention is not limited to this. Pluralities of image data for display respectively corresponding to the plurality of positions H1 to Hj in the Z direction may be generated on the basis of the imaginary emitting direction of light designated by the user. Depth synthesis image data for display may be generated on the basis of the generated pluralities of image data for display. In this case, the mask image data is unnecessary.
Note that, in the above explanation, the mask image data is generated on the basis of the image data at the time when the ring illumination is emitted. However, the present invention is not limited to this. The mask image data may be generated on the basis of each of image data at the time when the ring illumination is emitted and image data at the time when the directional illumination is emitted. The mask image data at the time when the directional illumination is emitted may be generated for each of the plurality of directional illuminations. This configuration is useful when an optimum position in the Z direction is different because, for example, light amounts are different in the ring illumination and the directional illumination.
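A numpy sketch of the mask-based depth synthesis explained above. The focus degree is computed here as the magnitude of a discrete Laplacian, which is only one common sharpness measure; neither this measure nor the data layout below is prescribed by the embodiment:

    import numpy as np

    def focus_degree(image):
        # Per-pixel focus degree: magnitude of a discrete Laplacian
        # (an assumed sharpness measure).
        f = image.astype(np.float64)
        laplacian = (np.roll(f, 1, 0) + np.roll(f, -1, 0) +
                     np.roll(f, 1, 1) + np.roll(f, -1, 1) - 4.0 * f)
        return np.abs(laplacian)

    def synthesize_ring(ring_stack):
        # Steps S306 to S308 / S331 to S333: per pixel, keep the value
        # from the Z position with the highest focus degree, and record
        # that position number as the mask image data.
        stack = np.stack(ring_stack)                    # shape (j, H, W)
        scores = np.stack([focus_degree(s) for s in stack])
        mask = np.argmax(scores, axis=0)                # mask image data
        depth = np.take_along_axis(stack, mask[None], axis=0)[0]
        return depth, mask

    def synthesize_directional(directional_stack, mask):
        # Steps S315 / S335: reuse the mask generated from the ring
        # illumination to combine a directional-illumination stack.
        stack = np.stack(directional_stack)
        return np.take_along_axis(stack, mask[None], axis=0)[0]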
The system program stored in the storing section 420 shown in
If the lens unit 131 has not moved to the upper limit position in step S304, the control section 410 moves the lens unit 131 upward by a predetermined amount (a moving pitch set in advance) (step S305). Thereafter, the control section 410 returns to the processing in step S302. The control section 410 repeats the processing in steps S302 to S305 until the lens unit 131 moves to the upper limit position.
If the lens unit 131 has moved to the upper limit position in step S304, the control section 410 determines a focus degree of each of pixels concerning original image data corresponding to the ring illumination (step S306). Subsequently, the control section 410 generates depth synthesis image data corresponding to the ring illumination by combining pixel data of a plurality of original image data on the basis of a determination result of the focus degree (step S307). The control section 410 generates mask image data indicating a correspondence relation between pixels of combined image data and numbers of the original image data and causes the storing section 420 to store the mask image data (step S308).
Thereafter, the control section 410 moves the lens unit 131 to the lower limit position (step S309). Subsequently, the control section 410 sets i to 1 (step S310). In this step, i indicates the number of a directional illumination among the plurality of directional illuminations. Subsequently, the control section 410 irradiates the i-th directional illumination on the observation target S with the light projecting section 140 and images the observation target S with the imaging section 132 (step S311). The control section 410 gives a number corresponding to the position in the Z direction of the lens unit 131 to the generated original image data (step S312). Subsequently, the control section 410 determines whether the lens unit 131 has moved to the upper limit position (step S313).
If the lens unit 131 has not moved to the upper limit position in step S313, the control section 410 moves the lens unit 131 upward by a predetermined amount (a moving pitch set in advance) (step S314). Thereafter, the control section 410 returns to the processing in step S311. The control section 410 repeats the processing in steps S311 to S314 until the lens unit 131 moves to the upper limit position.
If the lens unit 131 has moved to the upper limit position in step S313, the control section 410 generates depth synthesis image data corresponding to the i-th directional illumination by combining pixel data of the plurality of original image data on the basis of the mask image data stored in the storing section 420 (step S315).
Subsequently, the control section 410 determines whether i is 4 (step S316). If i is not 4 in step S316, the control section 410 updates i to i+1 (step S317). The control section 410 moves the lens unit 131 to the lower limit position (step S318). Thereafter, the control section 410 returns to the processing in step S311. The control section 410 repeats the processing in steps S311 to S318 until i reaches 4. Consequently, pluralities of original image data corresponding to the respective first to fourth directional illuminations are generated. Depth synthesis image data corresponding to the respective first to fourth directional illuminations are generated. If i is 4 in step S316, the control section 410 ends the processing.
In the above explanation, a part of the processing may be performed at another point in time. For example, the processing in steps S306 to S308 may be executed in parallel to steps S309 to S314. The processing in step S315 corresponding to the i-th directional illumination may be executed in parallel to steps S311 to S314 corresponding to (i+1)-th directional illumination. In these cases, it is possible to increase the speed of the depth synthesis processing.
Alternatively, the processing in steps S306 to S308 may be executed later than the processing in steps S309 to S314. The processing in step S315 corresponding to the i-th directional illumination may be executed later than the processing in steps S311 to S314 corresponding to the (i+1)-th directional illumination.
Subsequently, the control section 410 sets i to 1 (step S324). In this step, i indicates the number of a directional illumination among the plurality of directional illuminations. Thereafter, the control section 410 irradiates the i-th directional illumination on the observation target S with the light projecting section 140 and images the observation target S with the imaging section 132 (step S325). The control section 410 gives a number corresponding to the position in the Z direction of the lens unit 131 to the generated original image data (step S326). Subsequently, the control section 410 determines whether i is 4 (step S327).
If i is not 4 in step S327, the control section 410 updates i to i+1 (step S328). Thereafter, the control section 410 returns to the processing in step S325. The control section 410 repeats the processing in steps S325 to S328 until i reaches 4. Consequently, pluralities of original image data corresponding to the respective first to fourth directional illuminations are generated. If i is 4 in step S327, the control section 410 determines whether the lens unit 131 has moved to the upper limit position (step S329).
If the lens unit 131 has not moved to the upper limit position in step S329, the control section 410 moves the lens unit 131 upward by a predetermined amount (step S330). Thereafter, the control section 410 returns to the processing in step S322. The control section 410 repeats the processing in steps S322 to S330 until the lens unit 131 moves to the upper limit position.
If the lens unit 131 has moved to the upper limit position in step S329, the control section 410 determines a focus degree of each of pixels concerning the original image data corresponding to the ring illumination (step S331). Subsequently, the control section 410 generates depth synthesis image data corresponding to the ring illumination by combining pixel data of the plurality of original image data on the basis of a determination result of the focus degree (step S332). The control section 410 generates mask image data indicating a correspondence relation between pixels of combined image data and numbers of the original image data and causes the storing section 420 to store the mask image data (step S333).
Subsequently, the control section 410 sets i to 1 again (step S334). Thereafter, the control section 410 generates depth synthesis image data corresponding to the i-th directional illumination by combining the pixel data of the plurality of original image data on the basis of the mask image data stored in the storing section 420 (step S335).
Subsequently, the control section 410 determines whether i is 4 (step S336). If i is not 4 in step S336, the control section 410 updates i to i+1 (step S337). Thereafter, the control section 410 returns to the processing in step S335. The control section 410 repeats the processing in steps S335 to S337 until i reaches 4. Consequently, depth synthesis image data corresponding to the respective first to fourth directional illuminations are generated. If i is 4 in step S336, the control section 410 ends the processing.
In the above explanation, a part of the processing may be performed at another point in time. For example, a part of the processing in steps S331 to S337 may be executed in parallel to steps S321 to S330. In this case, it is possible to increase the speed of the depth synthesis processing. The processing in steps S322 and S323 may be executed later than the processing in steps S324 to S328.
In the example and the other example of the depth synthesis processing, the lens unit 131 is moved upward to the upper limit position by the predetermined amount at a time after being moved to the lower limit position serving as the initial position. However, the present invention is not limited to this. In the depth synthesis processing, the lens unit 131 may be moved downward to the lower limit position by the predetermined amount at a time after being moved to the upper limit position serving as the initial position.
Note that, in the above explanation, the pluralities of original image data respectively corresponding to the positions H1 to Hj are combined in the depth synthesis processing. However, the present invention is not limited to this. The pluralities of original image data respectively corresponding to the positions H1 to Hj may be independently used without being combined.
For example, original image data in which the focus of the imaging section 132 most closely coincides with a specific portion of the observation target S may be extracted from the pluralities of original image data respectively corresponding to the positions H1 to Hj on the basis of the determination result by the focus determining section 620 shown in
The user can give an instruction for the DR adjustment processing to the arithmetic processing section 600 by operating the DR adjustment button b2 shown in
In the DR adjustment processing, in a state in which the light reception time of the imaging section 132 is changed to a plurality of values decided in advance, the observation target S at the time when the ring illumination and the first to fourth directional illuminations are respectively irradiated is imaged. Consequently, a plurality of original image data respectively corresponding to the ring illumination and the first to fourth directional illuminations are generated by the data generating section 610 shown in
The plurality of original image data corresponding to the ring illumination are combined by the data generating section 610. Consequently, it is possible to adjust a dynamic range of the original image data corresponding to the ring illumination. Similarly, the pluralities of original image data corresponding to the directional illuminations are combined by the data generating section 610. Consequently, it is possible to adjust a dynamic range of the original image data corresponding to the directional illuminations.
The adjustment of the dynamic range includes expansion and reduction of the dynamic range. By combining the pluralities of original image data to expand the dynamic range, it is possible to reduce blocked-up shadows (black solid) and halation (white void) in an image. On the other hand, by combining the pluralities of original image data to reduce the dynamic range, the contrast between light and shade in the image is increased. Consequently, it is possible to precisely observe unevenness of the observation target S having a smooth surface.
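One way the combination for the DR adjustment could work is sketched below, assuming 8-bit images and the light reception times as inputs; the mid-tone weighting and the final rescaling are illustrative choices, not the apparatus's specified method:

    import numpy as np

    def adjust_dynamic_range(images, reception_times):
        # Combine original image data captured at several light reception
        # times. Pixels near mid-gray are trusted most, so that blocked-up
        # shadows and halation contribute little (assumed weighting rule).
        accumulated = np.zeros(images[0].shape, dtype=np.float64)
        weight_sum = np.zeros_like(accumulated)
        for image, t in zip(images, reception_times):
            f = image.astype(np.float64)
            w = 1.0 - np.abs(f - 127.5) / 127.5
            accumulated += w * (f / t)      # exposure-normalized value
            weight_sum += w
        radiance = accumulated / np.maximum(weight_sum, 1e-9)
        # Rescale into the 8-bit display range; expansion or reduction of
        # the dynamic range would be controlled at this step.
        radiance -= radiance.min()
        peak = radiance.max()
        if peak > 0:
            radiance = 255.0 * radiance / peak
        return radiance.astype(np.uint8)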
In the above explanation, the original image data are combined first such that the dynamic range is adjusted for each of the ring illumination and the first to fourth directional illuminations. The image data for display is generated on the basis of the combined pluralities of original image data. However, the present invention is not limited to this. The image data for display may be generated first on the basis of the pluralities of original image data in every light reception time of the imaging section 132. The image data for display generated in every light reception time of the imaging section 132 may be combined such that the dynamic range is adjusted.
The system program stored in the storing section 420 shown in
If the observation target S has not yet been imaged at all of the desired light reception times of the imaging section 132 in step S403, the control section 410 sets the light reception time of the imaging section 132 to the next value decided in advance (step S404). Thereafter, the control section 410 returns to the processing in step S402. The control section 410 repeats the processing in steps S402 to S404 until the observation target S is imaged at all of the desired light reception times of the imaging section 132.
If the observation target S has been imaged in all of the desired light reception times of the imaging section 132 in step S403, the control section 410 combines a generated plurality of original image data corresponding to the ring illumination (step S405). Consequently, a dynamic range of the original image data corresponding to the ring illumination is adjusted.
Thereafter, the control section 410 sets i to 1 (step S406). In this step, i indicates the number of a directional illumination among the plurality of directional illuminations. Subsequently, the control section 410 sets the light reception time of the imaging section 132 to an initial value decided in advance (step S407). In this state, the control section 410 irradiates the i-th directional illumination on the observation target S with the light projecting section 140 and images the observation target S with the imaging section 132 (step S408). Subsequently, the control section 410 determines whether the observation target S has been imaged at all of the desired light reception times of the imaging section 132 in a state in which the i-th directional illumination is irradiated (step S409).
If the observation target S has not yet been imaged at all of the desired light reception times of the imaging section 132 in step S409, the control section 410 sets the light reception time of the imaging section 132 to the next value decided in advance (step S410). Thereafter, the control section 410 returns to the processing in step S408. The control section 410 repeats the processing in steps S408 to S410 until the observation target S is imaged at all of the desired light reception times of the imaging section 132.
If the observation target S has been imaged in all of the desired light reception times of the imaging section 132 in step S409, the control section 410 combines a generated plurality of original image data corresponding to the i-th directional illumination (step S411). Consequently, a dynamic range of the original image data corresponding to the i-th directional illumination is adjusted.
Subsequently, the control section 410 determines whether i is 4 (step S412). If i is not 4 in step S412, the control section 410 updates i to i+1 (step S413). Thereafter, the control section 410 returns to the processing in step S407. The control section 410 repeats the processing in steps S407 to S413 until i reaches 4. Consequently, pluralities of original image data corresponding to the respective first to fourth directional illuminations are generated and combined to adjust a dynamic range. If i is 4 in step S412, the control section 410 ends the processing.
In the above explanation, a part of the processing may be performed at another point in time. For example, the processing in step S405 may be executed in parallel to the processing in steps S406 to S413. The processing in step S411 corresponding to the i-th directional illumination may be executed in parallel to steps S407 to S410 corresponding to the (i+1)-th directional illumination. In these cases, it is possible to increase the speed of the DR adjustment processing.
Alternatively, the processing in steps S401 to S405 may be executed later than the processing in steps S406 to S413. The processing in step S411 corresponding to the i-th directional illumination may be executed later than the processing in steps S407 to S410 corresponding to the (i+1)-th directional illumination.
Subsequently, the control section 410 sets i to 1 (step S423). In this step, i indicates the number of a directional illumination among the plurality of directional illuminations. Thereafter, the control section 410 irradiates the i-th directional illumination on the observation target S with the light projecting section 140 and images the observation target S with the imaging section 132 (step S424).
Subsequently, the control section 410 determines whether i is 4 (step S425). If i is not 4 in step S425, the control section 410 updates i to i+1 (step S426). Thereafter, the control section 410 returns to the processing in step S424. The control section 410 repeats the processing in steps S424 to S426 until i reaches 4. Consequently, pluralities of original image data corresponding to the respective first to fourth directional illuminations are generated. Subsequently, the control section 410 determines whether the observation target S has been imaged in all of the desired light reception times of the imaging section 132 (step S427).
If the observation target S has not yet been imaged at all of the desired light reception times of the imaging section 132 in step S427, the control section 410 sets the light reception time of the imaging section 132 to the next value decided in advance (step S428). Thereafter, the control section 410 returns to the processing in step S422. The control section 410 repeats the processing in steps S422 to S428 until the observation target S is imaged at all of the desired light reception times of the imaging section 132.
If the observation target S has been imaged in all of the desired light reception times of the imaging section 132 in step S427, the control section 410 combines a generated plurality of original image data corresponding to the ring illumination to adjust a dynamic range (step S429). Consequently, a dynamic range of the original image data corresponding to the ring illumination is adjusted.
Thereafter, the control section 410 sets i to 1 again (step S430). Subsequently, the control section 410 combines a generated plurality of original image data corresponding to the i-th directional illumination (step S431). Consequently, a dynamic range of the original image data corresponding to the i-th directional illumination is adjusted. Subsequently, the control section 410 determines whether i is 4 (step S432).
If i is not 4 in step S432, the control section 410 updates i to i+1 (step S433). Thereafter, the control section 410 returns to the processing in step S431. The control section 410 repeats the processing in steps S431 to S433 until i reaches 4. Consequently, the pluralities of original image data corresponding to the respective first to fourth directional illuminations are combined to adjust a dynamic range. If i is 4 in step S432, the control section 410 ends the processing.
In the above explanation, a part of the processing may be performed at another point in time. For example, a part of the processing in steps S429 to S433 may be performed in parallel to the processing in steps S421 to S428. In this case, it is possible to increase the speed of the DR adjustment processing. The processing in step S422 may be executed later than the processing in steps S423 to S426. Further, the processing in step S429 may be executed later than the processing in steps S430 to S433.
The surface of the observation target S, original image data of which is generated by imaging performed by the imaging section 132 shown in
It is possible to generate image data for display by combining the pluralities of connected image data including the connected image data CG1 and CG2. An irradiating position of illumination and an overlapping portion are sometimes different for each of the connected image data. In this case, as shown in
The user can give an instruction for the connection processing to the arithmetic processing section 600 by operating the connection button b3 shown in
The stage 121 is moved such that parts of the original image data OG1 to OG4 adjacent to one another overlap one another. In the example shown in
Positions of the stage 121 at the time when the original image data OG1 to OG4 are generated are calculated by the position calculating section 632 shown in
Similarly, the stage 121 is sequentially moved in the X direction while the directional illuminations are irradiated on the observation target S. In this state, a plurality of original image data adjacent to one another in the X direction are sequentially generated by the imaging section 132. The data generating section 610 corrects the positions of the generated original image data on the basis of the position information and the overlapping region information stored in the storing section 420 and connects the original image data adjacent to one another after the correction.
With this procedure of the connection processing, it is possible to connect, with high accuracy, the original image data adjacent to one another corresponding to the directional illuminations without performing the pattern matching. Since it is unnecessary to perform the pattern matching, it is possible to increase the speed of the connection processing. Further, the sizes of the image data after the connection corresponding to the directional illuminations coincide with the sizes of the image data after the connection corresponding to the ring illumination. Consequently, it is possible to easily generate image data for display indicating a region of the observation target S larger than the unit region using a plurality of image data after the connection.
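The two-stage connection explained above can be pictured as follows: the offset between adjacent unit regions is found once by pattern matching on the ring-illumination images, then reused as-is for the directional-illumination images. The exhaustive search over a small window (and the restriction to a vertical shift, for brevity) is an assumed matching method; the embodiment does not prescribe one:

    import numpy as np

    def find_offset(left, right, overlap, search=5):
        # Pattern matching for the ring illumination: find the vertical
        # shift of 'right' that best aligns its overlapping strip with
        # the right edge of 'left' (brute-force search, assumed method).
        reference = left[:, -overlap:].astype(np.float64)
        best_dy, best_error = 0, np.inf
        for dy in range(-search, search + 1):
            candidate = np.roll(right, dy, axis=0)[:, :overlap].astype(np.float64)
            error = np.mean((reference - candidate) ** 2)
            if error < best_error:
                best_dy, best_error = dy, error
        return best_dy

    def connect_pair(left, right, overlap, dy):
        # Connection using a stored offset: for the directional
        # illuminations, the (dy, overlap) recorded from the ring
        # illumination is reused, so no pattern matching is performed.
        shifted = np.roll(right, dy, axis=0)
        return np.hstack([left, shifted[:, overlap:]])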
In the connection processing, the plurality of image data after the connection are generated first. The image data for display indicating the region of the observation target S larger than the unit region is generated using the generated plurality of image data after the connection. However, the present invention is not limited to this. Image data for display in a plurality of positions of the stage 121 may be generated first. The image data for display indicating the region of the observation target S larger than the unit region may be generated by connecting the generated plurality of image data for display. In this case, a plurality of image data for display are desirably connected using the pattern matching. The plurality of image data for display are desirably connected using overlapping region information at the time when the ring illumination is emitted.
The system program stored in the storing section 420 shown in
If not all of the desired regions of the observation target S have been imaged in step S504, the control section 410 moves the stage 121 by a predetermined amount (step S505). Thereafter, the control section 410 returns to the processing in step S502. The control section 410 repeats the processing in steps S502 to S505 until all of the desired regions of the observation target S are imaged.
If all of the desired regions of the observation target S have been imaged in step S504, the control section 410 connects a generated plurality of original image data corresponding to the ring illumination (step S506). The control section 410 causes the storing section 420 to store overlapping region information indicating overlapping regions at the time when the original image data adjacent to one another are connected (step S507).
Thereafter, the control section 410 moves the stage 121 to the initial position (step S508). Subsequently, the control section 410 sets i to 1 (step S509). In this step, i indicates the number of a directional illumination among the plurality of directional illuminations. Subsequently, the control section 410 irradiates the i-th directional illumination on the observation target S with the light projecting section 140 and images the observation target S with the imaging section 132 (step S510). Subsequently, the control section 410 determines whether all of the desired regions of the observation target S have been imaged in a state in which the i-th directional illumination is irradiated (step S511).
If not all of the desired regions of the observation target S have been imaged in step S511, the control section 410 moves the stage 121 by a predetermined amount (step S512). Thereafter, the control section 410 returns to the processing in step S510. The control section 410 repeats the processing in steps S510 to S512 until all of the desired regions of the observation target S are imaged.
If all of the desired regions of the observation target S have been imaged in step S511, the control section 410 corrects the positions of a generated plurality of original image data corresponding to the i-th directional illumination on the basis of the position information and the overlapping region information stored in the storing section 420 (step S513). The control section 410 connects the corrected plurality of original image data (step S514).
Subsequently, the control section 410 determines whether i is 4 (step S515). If i is not 4 in step S515, the control section 410 updates i to i+1 (step S516). Thereafter, the control section 410 returns to the processing in step S510. The control section 410 repeats the processing in steps S510 to S516 until i reaches 4. Consequently, a plurality of original image data corresponding to the respective first to fourth directional illuminations are generated. The plurality of original image data are connected on the basis of the position information and the overlapping region information stored in the storing section 420. If i is 4 in step S515, the control section 410 ends the processing.
In the above explanation, a part of the processing may be performed at another point in time. For example, the processing in steps S506 and S507 may be executed in parallel to steps S508 to S512. The processing in steps S513 and S514 corresponding to the i-th directional illumination may be executed in parallel to steps S510 to S512 corresponding to the (i+1)-th directional illumination. In these cases, it is possible to increase the speed of the connection processing.
Alternatively, the processing in steps S506 and S507 may be executed later than the processing in steps S508 to S512. The processing in steps S513 and S514 corresponding to the i-th directional illumination may be executed later than the processing in steps S510 to S512 corresponding to the (i+1)-th directional illumination.
Subsequently, the control section 410 sets i to 1 (step S524). In this step, i indicates the number of a directional illumination among the plurality of directional illuminations. Thereafter, the control section 410 irradiates the i-th directional illumination on the observation target S with the light projecting section 140 and images the observation target S with the imaging section 132 (step S525). Subsequently, the control section 410 determines whether i is 4 (step S526).
If i is not 4 in step S526, the control section 410 updates i to i+1 (step S527). Thereafter, the control section 410 returns to the processing in step S525. The control section 410 repeats the processing in steps S525 to S527 until i becomes 4. Consequently, pluralities of original image data corresponding to the respective first to fourth directional illuminations are generated. If i is 4 in step S526, the control section 410 determines whether all of the desired regions of the observation target S have been imaged (step S528).
If not all of the desired regions of the observation target S have been imaged in step S528, the control section 410 moves the stage 121 by a predetermined amount (step S529). Thereafter, the control section 410 returns to the processing in step S522. The control section 410 repeats the processing in steps S522 to S528 until all of the desired regions of the observation target S are imaged.
If all of the desired regions of the observation target S have been imaged in step S528, the control section 410 connects a generated plurality of original image data corresponding to the ring illumination (step S530). The control section 410 causes the storing section 420 to store overlapping region information indicating overlapping regions at the time when the original image data adjacent to one another are connected (step S531).
Subsequently, the control section 410 sets i to 1 again (step S532). Thereafter, the control section 410 corrects the positions of a generated plurality of original image data corresponding to the i-th directional illumination on the basis of the position information and the overlapping region information stored in the storing section 420 (step S533). The control section 410 connects the corrected plurality of original image data (step S534). Subsequently, the control section 410 determines whether i is 4 (step S535).
If i is not 4 in step S535, the control section 410 updates i to i+1 (step S536). Thereafter, the control section 410 returns to the processing in step S533. The control section 410 repeats the processing in steps S533 to S536 until i reaches 4. Consequently, pluralities of original image data corresponding to the respective first to fourth directional illuminations are connected on the basis of the position information and the overlapping region information stored in the storing section 420. If i is 4 in step S535, the control section 410 ends the processing.
In the above explanation, a part of the processing may be performed at another point in time. For example, a part of the processing in steps S530 to S536 may be executed in parallel to the processing in steps S521 to S529. In this case, it is possible to increase the speed of the connection processing. The processing in step S523 may be executed later than the processing in steps S524 to S527.
In the magnifying observation apparatus 1 according to this embodiment, the plurality of original image data respectively corresponding to the ring illumination and the first to fourth directional illuminations are generated by the plural illumination imaging processing. In the image-for-display generation processing, the imaginary emitting direction of light is designated by the light icon ss1 on the display section 430 on the basis of the operation of the operation section 440 by the user. The image data for display indicating the image SI of the observation target S that should be obtained when it is assumed that the light in the designated emitting direction is irradiated on the observation target S is generated on the basis of the designated emitting direction and the plurality of original image data. The image SI based on the generated image data for display is displayed on the display section 430.
Therefore, by optionally designating the imaginary emitting direction of light, the user can generate, without changing the emitting direction of light actually irradiated on the observation target S, image data for display indicating the image SI at the time when light in an appropriate emitting direction corresponding to the shape and the material of the observation target S is irradiated on the observation target S. Consequently, the user can easily acquire the image SI of the observation target S corresponding to a request of the user.
The image data for display can be generated using an already generated plurality of image data. Therefore, it is unnecessary to perform the imaging of the observation target S again, and the burden on the user can be reduced.
In the image-for-display generation processing, when the designated emitting direction is neither the ring emitting direction nor any of the first to fourth emitting directions, combination rates are calculated concerning the plurality of original image data. The plurality of original image data are combined on the basis of the combination rates. Consequently, it is possible to easily generate image data for display corresponding to the designated emitting direction.
Concerning a magnifying observation apparatus according to a second embodiment of the present invention, differences from the magnifying observation apparatus 1 according to the first embodiment are explained.
The lens unit 131 is configured to be capable of holding the light projecting section 160 on the inside. The light projecting section 160 is disposed in the lens unit 131 in a state in which the light projecting section 160 is inclined at approximately 45° with respect to the optical axis A1 of the objective lens 131a such that a reflection surface of the half mirror 161 faces obliquely downward. The light projecting section 160 is optically connected to the light generating section 300 of the processing device 200 by a part of not-shown optical fibers of the fiber unit 201.
The light blocking section 320 of the light generating section 300 includes a plurality of opening patterns respectively corresponding to the regions 140A to 140D of the light projecting section 140 shown in
The light made incident on the light projecting section 160 is reflected by the half mirror 161 to be emitted downward along the optical axis A1 of the objective lens 131a and irradiated on the observation target S. The light emitted from the light projecting section 160 is referred to as coaxial epi-illumination. The light irradiated on the observation target S is reflected upward, transmitted through the half mirror 161 of the light projecting section 160 and the lens unit 131, and guided to the imaging section 132.
With the configuration explained above, the light projecting section 160 is capable of irradiating the light on the observation target S from a position closer to the optical axis A1 of the objective lens 131a than the light projecting section 140. Therefore, the coaxial epi-illumination is bright-field illumination emitted in a direction parallel to the optical axis A1 of the objective lens 131a. The ring illumination is dark-field illumination irradiated in a direction inclined with respect to the optical axis A1 of the objective lens 131a. By irradiating the coaxial epi-illumination on the observation target S, it is possible to more clearly image unevenness on the surface of the observation target S and a difference of a material. Note that it is also possible to simultaneously irradiate lights on the observation target S from the light projecting sections 140 and 160.
The imaging control section 520 shown in
The imaging section 132 further generates original image data indicating the observation target S at the time when the coaxial epi-illumination is irradiated on the observation target S. The generated original image data corresponding to the coaxial epi-illumination is stored in the storing section 420. Imaging information further indicating presence or absence of execution of the irradiation of the coaxial epi-illumination and imaging conditions such as a light reception time during the irradiation of the coaxial epi-illumination is stored in the storing section 420.
The data generating section 610 shown in
In the plural illumination imaging processing according to this embodiment, the observation target S is imaged using the coaxial epi-illumination after the imaging of the observation target S performed using the ring illumination and the first to fourth directional illuminations. Consequently, a plurality of (in this example, six) original image data respectively corresponding to the ring illumination, the first to fourth directional illuminations, and the coaxial epi-illumination are generated.
When the plural illumination imaging processing is completed, the observation screen 430A is displayed on the display section 430.
The user operates the epi-illumination button b11 using the console 440A shown in
When it is instructed that the original image data corresponding to the coaxial epi-illumination is used, a bar 433c and a slider 433d for designating a combination rate of the original image data corresponding to the coaxial epi-illumination to the other original image data (hereinafter referred to as epi-illumination image rate) are displayed in the sub-display region 433.
The user can designate the epi-illumination image rate by operating the slider 433d. In the example shown in
When the epi-illumination image rate is designated, in the processing in step S209 in
For example, first, the control section 410 calculates combination rates of original image data respectively corresponding to the ring illumination and the first to fourth directional illuminations on the basis of the designated emitting direction. Thereafter, when the designated epi-illumination image rate is t % (t being a number from 0 to 100), the control section 410 corrects the combination rates of the original image data respectively corresponding to the ring illumination and the first to fourth directional illuminations such that the total of the plurality of combination rates of the ring illumination and the first to fourth directional illuminations is (100−t) %.
The plurality of original image data are combined on the basis of the plurality of combination rates calculated as explained above, whereby image data for display is generated. The image SI of the observation target S including components of the original image data corresponding to the coaxial epi-illumination is displayed in the main display region 432.
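The correction described above amounts to a simple scaling, sketched below; the same logic applies to the transmission image rate u in the third embodiment explained later. The function name is an assumption:

    def apply_epi_rate(direction_rates, t):
        # Scale the rates of the ring and first to fourth directional
        # illuminations so that they total (100 - t) %, the coaxial
        # epi-illumination contributing the remaining t %.
        if not 0 <= t <= 100:
            raise ValueError("epi-illumination image rate must be 0 to 100")
        scale = (100.0 - t) / 100.0
        return [r * scale for r in direction_rates] + [t / 100.0]

    # Example: five equal rates of 0.2 and an epi-illumination rate of 40 %.
    print(apply_epi_rate([0.2] * 5, 40))  # [0.12, 0.12, 0.12, 0.12, 0.12, 0.4]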
With the image SI based on the original image data corresponding to the coaxial epi-illumination, it is possible to observe the unevenness on the surface of the observation target S and the difference of the material more accurately than with the images SI based on the original image data respectively corresponding to the ring illumination and the first to fourth directional illuminations.
Therefore, the user can easily acquire the image SI of the observation target S corresponding to a purpose of observation by operating the epi-illumination button b11 and the slider 433d shown in
Concerning a magnifying observation apparatus according to a third embodiment of the present invention, differences from the magnifying observation apparatus 1 according to the first embodiment are explained.
The setting section 111 of the stand section 110 is configured to be capable of holding the light projecting section 170 on the inside. The light projecting section 170 is disposed in the setting section 111 in a state in which the light projecting section 170 is inclined at approximately 45° with respect to the optical axis A1 of the objective lens 131a such that a reflection surface of the mirror 171 faces obliquely upward. Consequently, the light projecting section 170 is opposed to the lens barrel section 130 across the observation target S and the stage 121. The light projecting section 170 is optically connected to the light generating section 300 of the processing device 200 by a part of the not-shown optical fibers of the fiber unit 201.
The light blocking section 320 of the light generating section 300 includes a plurality of opening patterns respectively corresponding to the regions 140A to 140D of the light projecting section 140 shown in
The light made incident on the light projecting section 170 is reflected by the mirror 171 to be emitted upward along the optical axis A1 of the objective lens 131a and irradiated on the observation target S on the stage 121. The light emitted from the light projecting section 170 is referred to as transmission illumination. The light irradiated on the observation target S is transmitted upward, transmitted through the lens unit 131, and guided to the imaging section 132. By irradiating the transmission illumination on the observation target S, it is possible to image the structure on the inside of the observation target S.
The imaging section 132 further generates original image data indicating the observation target S at the time when the transmission illumination is irradiated on the observation target S. The generated original image data corresponding to the transmission illumination is stored in the storing section 420. Imaging information further indicating presence or absence of execution of irradiation of the transmission illumination and imaging conditions such as a light reception time during the irradiation of the transmission illumination is stored in the storing section 420.
The data generating section 610 shown in
The measurement head 100 shown in
In the plural illumination imaging processing according to this embodiment, the observation target S is imaged using the transmission illumination after the imaging of the observation target S performed using the ring illumination and the first to fourth directional illuminations. Consequently, a plurality of (in this example, six) original image data respectively corresponding to the ring illumination, the first to fourth directional illuminations, and the transmission illumination are generated.
When the plural illumination imaging processing is completed, the observation screen 430A is displayed on the display section 430.
The user operates the transmission button b12 using the console 440A shown in
When it is instructed that the original image data corresponding to the transmission illumination is used, a bar 433e and a slider 433f for designating a combination rate of the original image data corresponding to the transmission illumination to the other original image data (hereinafter referred to as transmission image rate) are displayed in the sub-display region 433.
The user can designate the transmission image rate by operating the slider 433f. In the example shown in
When the transmission image rate is designated, in the processing in step S209 in
For example, first, the control section 410 calculates combination rates of original image data respectively corresponding to the ring illumination and the first to fourth directional illuminations on the basis of the designated emitting direction. Thereafter, when the designated transmission image rate is u % (u being a number from 0 to 100), the control section 410 corrects the combination rates of the original image data respectively corresponding to the ring illumination and the first to fourth directional illuminations such that the total of the plurality of combination rates of the ring illumination and the first to fourth directional illuminations is (100−u) %.
The plurality of original image data are combined on the basis of the plurality of combination rates calculated as explained above, whereby the image data for display is generated. The image SI of the observation target S including components of the original image data corresponding to the transmission illumination is displayed in the main display region 432. When the observation target S is formed of a material that transmits light, the internal structure of the observation target S clearly appears in the image SI owing to the components of the original image data corresponding to the transmission illumination.
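The rate correction and weighted combination described above can be expressed as the following minimal sketch; the direction-based rates and the image arrays are assumed to be given, and all names are hypothetical:

```python
import numpy as np

def combine_for_display(direction_rates, direction_images, transmission_image, u):
    """Combine original image data into image data for display.

    direction_rates: combination rates (totaling 100) for the ring
        illumination and the first to fourth directional illuminations,
        computed from the designated emitting direction.
    u: designated transmission image rate in percent (0 to 100).
    """
    # Correct the direction-based rates so that their total becomes (100 - u)%.
    corrected = [r * (100 - u) / 100.0 for r in direction_rates]
    # Weighted sum of all original image data, including the transmission one.
    display = sum(r / 100.0 * img.astype(float)
                  for r, img in zip(corrected, direction_images))
    display += (u / 100.0) * transmission_image.astype(float)
    return np.clip(display, 0, 255).astype(np.uint8)
```

With u = 100, only the transmission-illumination components appear in the image for display; with u = 0, the combination reduces to the direction-based rates alone.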
Therefore, the user can easily acquire the image SI of the observation target S corresponding to a purpose of observation by operating the transmission button b12 and the slider 433f.
(1) In the embodiment, after the plurality of original image data respectively corresponding to the plurality of illuminations are generated, the image data for display is generated on the basis of the plurality of original image data according to the illumination conditions designated by the user. However, the present invention is not limited to this. The illumination conditions may be designated by the user first. In this case, the original image data necessary for generating the image data for display to be generated are determined on the basis of the designated illumination conditions.
In this modification, only the illumination corresponding to the necessary original image data is irradiated on the observation target S, so that only the necessary original image data is generated. With this configuration, the other illuminations are not irradiated on the observation target S, and unnecessary original image data is not generated. Consequently, the image data for display can be generated at high speed.
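A sketch of this modification, under the same hypothetical helper names as in the earlier sketch, could derive the set of needed illuminations from the designated conditions and capture only those:

```python
def acquire_necessary_images(rates, switch_illumination, capture_image):
    """Capture only original image data whose designated combination
    rate is nonzero; the other illuminations are never irradiated."""
    images = {}
    for name, rate in rates.items():
        if rate > 0:                     # skip illuminations not needed
            switch_illumination(name)
            images[name] = capture_image()
    return images
```

For example, when the designated transmission image rate is 0%, the transmission illumination is skipped entirely and no corresponding original image data is generated.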
(2) In the embodiment, the regions 140A to 140D of the light projecting section 140 are desirably disposed rotation-symmetrically around the optical axis A1 of the objective lens 131a. However, the present invention is not limited to this. The regions 140A to 140D of the light projecting section 140 do not have to be disposed rotation-symmetrically around the optical axis A1 of the objective lens 131a.
(3) In the embodiment, when the image data for display is generated by combining the plurality of original image data, one of the plurality of original image data is desirably original image data corresponding to the ring illumination. However, the present invention is not limited to this. It is also possible that the original image data corresponding to the ring illumination is not used for the combination and the image data for display is generated by combining a part or all of the plurality of original image data respectively corresponding to the directional illuminations, the coaxial epi-illumination, and the transmission illumination.
(4) In the embodiment, the image data for display is generated by combining the plurality of original image data. However, the present invention is not limited to this. The image data for display may be generated by selecting one of the plurality of generated original image data. In this configuration, a larger number of original image data are desirably generated, in which case more accurate image data for display can be generated. Therefore, a larger number of light emission regions may be provided in order to make it possible to generate a larger number of original image data. Light emitting members may be provided so as to be capable of emitting lights from a larger number of positions. Alternatively, a single light emitting member may be provided so as to be movable to a plurality of emitting positions.
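One plausible selection rule for this modification, not prescribed by the specification, is to select the original image whose actual emitting direction is nearest the designated imaginary emitting direction. A minimal sketch with hypothetical names, treating directions as vectors:

```python
import numpy as np

def select_nearest_image(designated_dir, emitting_dirs, images):
    """Return the original image whose emitting direction has the
    largest cosine similarity to the designated direction."""
    d = np.asarray(designated_dir, dtype=float)
    d /= np.linalg.norm(d)
    scores = [np.dot(d, np.asarray(e, dtype=float) / np.linalg.norm(e))
              for e in emitting_dirs]
    return images[int(np.argmax(scores))]
```

The denser the set of emitting positions, the closer the selected image approximates the designated direction, which is why a larger number of original image data is desirable here.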
(5) In the embodiment, the light projecting section 140 has the cylindrical shape. However, the present invention is not limited to this. The light projecting section 140 may have a shape other than the cylindrical shape.
(6) In the embodiment, in the image-for-display generation processing, when the imaginary emitting direction of light is designated by the user anew, the image SI of the observation target S displayed on the display section 430 is switched to the image SI corresponding to the light in the designated emitting direction. However, the present invention is not limited to this.
When the imaginary emitting direction of light is newly designated by the user, the image SI of the observation target S based on the image data for display before the update and the image SI of the observation target S based on the image data for display after the update may be simultaneously displayed on the display section 430. In this case, the user can compare the image SI of the observation target S before the emitting direction is designated with the image SI after the designation. Therefore, the user can easily identify an appropriate image SI by observing the observation target S while designating the imaginary emitting direction of light. Examples of the simultaneous display of the images SI before and after the update include displaying the images SI side by side and displaying them superimposed on each other.
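As a minimal display sketch for this modification, assuming matplotlib and two image arrays as inputs (all names hypothetical):

```python
import matplotlib.pyplot as plt

def show_before_after(image_before, image_after):
    """Display the images SI before and after the update side by side."""
    fig, axes = plt.subplots(1, 2)
    for ax, img, title in zip(axes, (image_before, image_after),
                              ("before update", "after update")):
        ax.imshow(img)
        ax.set_title(title)
        ax.axis("off")
    plt.show()
```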
(7) In the embodiment, the emitting-direction designation field 433a for designating the imaginary emitting direction of light and the emitting-direction display field 433b for displaying the imaginary emitting direction of light are displayed on the display section 430 together with the image SI based on the image data for display. However, the present invention is not limited to this. The emitting-direction designation field 433a and the emitting-direction display field 433b may be displayed on another display device separate from the display section 430. For example, by providing a display function in the console 440A of the operation section 440, the emitting-direction designation field 433a and the emitting-direction display field 433b may be displayed on the console 440A.
(8) In the embodiment, the control section 410 of the magnifying observation apparatus 1 executes the plural illumination imaging processing, the image-for-display generation processing, the depth synthesis processing, the DR adjustment processing, and the connection processing. However, the present invention is not limited to this. The control section 410 only has to execute the plural illumination imaging processing and the image-for-display generation processing among the plurality of kinds of processing explained above and does not have to execute a part or all of the depth synthesis processing, the DR adjustment processing, and the connection processing.
An example of correspondence between the constituent elements of the claims and the sections of the embodiment is explained below. However, the present invention is not limited to this example.
In the embodiment, the observation target S is an example of the observation target, the stage 121 is an example of the stage, the ring illumination and the first to fourth directional illuminations are examples of the lights in the plurality of emitting directions, the light projecting sections 140, 160, and 170, the fiber units 201 and 204, and the light generating section 300 are examples of the light projecting device, the imaging section 132 is an example of the imaging section, the display section 430 and the operation section 440 are examples of the designating section, the data generating section 610 is an example of the data generating section, the display section 430 is an example of the display section, and the magnifying observation apparatus 1 is an example of the magnifying observation apparatus.
The light icon ss1 and the emitting position image ss4 are examples of the first indicator, the display section 430 is an example of the indicator display section, the operation section 440 is an example of the operation section, the target partial image sp and the reference point image ss2 on the target position image ss0 are an example of the second indicator, and the hemispherical image ss3 is an example of the third indicator.
Further, the function display region 431 in the display section 430 is an example of the first display region, the main display region 432 in the display section 430 is an example of the second display region, the objective lens 131a is an example of the objective lens, the regions 140A to 140D of the light projecting section 140 are examples of the plurality of light emission regions, the light projecting section 140 is an example of the first light projecting section, and the light blocking section 320 is an example of the switching section.
The coaxial epi-illumination is an example of the first light, the light projecting section 160 is an example of the second light projecting section, the transmission illumination is an example of the second light, the light projecting section 170 is an example of the third light projecting section, the main display region 432 is an example of the main display region, the emitting-direction designation field 433a is an example of the sub-display region, the target partial image sp displayed in the emitting-direction designation field 433a is an example of the position image, and the light icon ss1 displayed in the main display region 432 and the emitting-direction designation field 433a is an example of the emitting position indicator.
As the constituent elements of the claims, various other elements having the configurations or the functions described in the claims can also be used.
The present invention can be effectively used in various magnifying observation apparatuses.
Number | Date | Country | Kind |
---|---|---|---
2016-144939 | Jul 2016 | JP | national |