This application is based upon and claims the benefit of priority from Japanese Patent Applications No. 2005-274331, filed Sep. 21, 2005; and No. 2006-208875, filed Jul. 31, 2006, the entire contents of which are incorporated herein by reference.
1. Field of the Invention
The present invention relates to an observation apparatus that captures an image of a sample for observation.
2. Description of the Related Art
One conventional technique for microscopy of a sample, such as a living cell, includes capturing images of the sample at time intervals (hereinafter, such a manner of image-taking will be referred to as time-lapse imaging) to generate observation images; reproducing the series of observation images after the time-lapse imaging is finished; and observing the resulting moving picture to check morphological changes in the sample over time. Such a technique is considered highly effective for observing temporal changes in the sample.
In recent years, time-lapse imaging has sometimes been performed at plural imaging positions, for example, when living cells cultured under the same condition are treated with plural types of agents to confirm the effects of the agents, or when temporal changes of different cells are observed under the same environment.
When the time-lapse imaging is performed at plural imaging positions (this manner of image-taking will be hereinafter referred to as multipoint time-lapse imaging), the plural imaging positions are not always located within the viewing field of one microscope. Even if the imaging positions reside on one particular living cell under observation, one or more imaging positions are often located outside the viewing field of the microscope. In addition, the plural imaging positions often reside on different living cells.
One conventional imaging technique that accommodates the inconveniences described above is described in Japanese Patent Application Laid-Open (JP-A) No. 2002-277754 (KOKAAI is incorrect; KOKAI). The structure and method described in JP-A No. 2002-277754 (KOKAI) allow for multipoint time-lapse imaging. The described method includes placing a sample containing living cells on a stage whose position is electrically controllable along the X, Y, and Z axes, and setting in advance the positional coordinates of the plural imaging positions, the exposure of an imaging element at each imaging position, the time interval of the time-lapse imaging for each imaging position, and the number of images to be captured.
The sample is illuminated by illumination light during the time-lapse imaging, and this irradiation causes discoloration of and damage to the sample. Hence, it is desirable that information on the irradiation of the illumination light, in particular the accumulated amount of light irradiated on the sample during the time-lapse imaging, be available to an operator when the operator evaluates the observation images after the time-lapse imaging is finished. In other words, it is desirable to provide information on the illumination condition together with the time-lapse observation images.
An observation apparatus according to one aspect of the present invention includes an illuminating unit that illuminates a sample; an imaging unit that captures an image of the sample to generate an observation image; a storage unit that stores the observation image in association with an illumination condition of the illuminating unit at generation of the observation image by the imaging unit; an imaging controller that controls the imaging unit to capture the image of the sample to generate the observation image and stores the observation image in the storage unit; and an illumination controller that controls the illuminating unit to illuminate the sample, and stores the illumination condition in the storage unit every time the imaging unit captures the image of the sample.
The above and other objects, features, advantages and technical and industrial significance of this invention will be better understood by reading the following detailed description of presently preferred embodiments of the invention, when considered in connection with the accompanying drawings.
Exemplary embodiments of the present invention will be described below with reference to the accompanying drawings.
The observation apparatus includes a microscope 10 for observation of a sample such as a living cell. The microscope 10 includes a microscope body 11, an intermediate lens barrel 21 arranged over the microscope body 11, and an eyepiece lens barrel 16 arranged on the intermediate lens barrel 21.
The microscope body 11 has an electromotive stage 12 which is movable in three dimensions (the X, Y, and Z directions), and a revolver 14 which can hold plural objective lenses 13. Generally, objective lenses 13 with different magnifications are attached to the revolver 14, and one of the attached objective lenses 13 is arranged on the optical path of the microscope 10. A sample S is placed on the electromotive stage 12. The sample S contains, for example, plural living cells that rest in a lower portion of a transparent container filled with culture solution. The electromotive stage 12 has plural built-in motors M, and can move the sample S placed thereon in three dimensions relative to the objective lens 13.
A transmitting illumination light source 31 is attached to the microscope body 11. The microscope body 11 has a field shutter (FS) 32, a neutral density (ND) filter 33, and a mirror 34. The transmitting illumination light source 31, the field shutter 32, the ND filter 33, and the mirror 34 together form a transmitting illumination optical system which serves to illuminate the sample S from below.
An incident-light illumination light source 22 is attached to the intermediate lens barrel 21. The intermediate lens barrel 21 has a field shutter 24. Further, optical elements are arranged inside the intermediate lens barrel 21 as appropriate for various types of microscope observation, such as polarization, phase contrast, Nomarski, and fluorescence observation. Such optical elements are, for example, various filters and a polarizing element, and are denoted collectively by reference numeral 23. Further, a variable power lens 15 is arranged as appropriate inside the microscope body 11 so that the observation magnification can be easily changed. The incident-light illumination light source 22, the optical element 23, the variable power lens 15, and the objective lens 13 together form an incident-light illumination optical system that serves to illuminate the sample S from above.
An eyepiece 17, which allows observation of the sample S with the naked eye, and an imaging unit 18, which captures the image of the sample S to generate an observation image, are attached to the eyepiece lens barrel 16. The imaging unit 18 may include a charge coupled device (CCD), for example, though it is not limited thereto. The imaging unit 18 captures the image of the sample S through an observation optical system that includes the objective lens 13 and the variable power lens 15; in other words, the imaging unit 18 captures the observation image that the observation optical system forms of the sample S.
The microscope further includes a stage driver 41, a revolver driver 42, an illumination controller 43, an optical element controller 44, and an FS controller 45.
The stage driver 41 drives the electromotive stage 12 in a horizontal direction (XY direction drive) and in a vertical direction (Z direction drive) in order to change an area position of an imaging area of the sample S relative to the imaging unit 18. Here, the term “area position” means the position of the imaging area, as indicated by the XYZ coordinate system, to which the imaging area is located by the electromotive stage 12.
The revolver driver 42 rotates the revolver 14 to arrange the objective lens 13 of a desired magnification on the optical path. Thus, the revolver driver 42 and the revolver 14 function as a power changing mechanism that changes an observation magnification adopted by the observation optical system to form the observation image.
The illumination controller 43 serves to control various types of lighting necessary for the imaging. For example, the illumination controller 43 turns on and turns off the incident-light illumination light source 22 that illuminates the sample S from above and the transmitting illumination light source 31 that illuminates the sample S from below, while adjusting the amount of light of the light sources 22 and 31.
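As a minimal sketch of the kind of control interface just described, the following Python fragment shows one way the on/off switching and light-amount adjustment of the two light sources might be represented; the class name, the source labels, and the 0.0 to 1.0 intensity scale are assumptions made for illustration only and are not taken from the embodiment.

```python
# Hypothetical sketch only: models on/off switching and light-amount
# adjustment of the incident-light source 22 and the transmitting source 31.
class IlluminationControllerSketch:
    def __init__(self):
        # Per-source state: whether the lamp is on and its relative intensity.
        self.state = {
            "incident": {"on": False, "intensity": 0.0},
            "transmitted": {"on": False, "intensity": 0.0},
        }

    def turn_on(self, source: str, intensity: float) -> None:
        # Switch the named source on at the requested relative intensity.
        self.state[source] = {"on": True, "intensity": intensity}

    def turn_off(self, source: str) -> None:
        # Switch the named source off.
        self.state[source] = {"on": False, "intensity": 0.0}
```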
The optical element controller 44 arranges the optical element 23 on the optical path, retracts the optical element 23 from the optical path, and exchanges the variable power lens 15. The function of exchanging the variable power lens 15 allows the optical element controller 44 to function as a power changing mechanism that changes the observation magnification of the observation image, similarly to the revolver driver 42 and the revolver 14.
The FS controller 45 controls the field shutters 24 and 32 so that the transmitting illumination optical system and the incident-light illumination optical system illuminate only an imaging region set for the imaging by the imaging unit 18.
The observation apparatus further includes a control unit 50, a monitor 55 that displays an image of a living cell and various pieces of information, an input device 56, and a storage unit 58 that stores the observation image, the XY coordinates of the electromotive stage 12, imaging conditions (including the illumination condition), and the like. The control unit 50 includes an imaging controller 51, a microscope controller 52, an operation information management unit 53, and an imaging information management unit 54. The imaging controller 51 serves as an imaging controller. The microscope controller 52 serves as an illumination controller, a movement controller, and a power change controller. The imaging information management unit 54 serves as a display controller.
The control unit 50 includes a central processing unit (CPU), a random access memory (RAM), and the like. The input device 56 includes, for example, a pointing device such as a mouse, and a keyboard. The storage unit 58 is, for example, a hard disk. The storage unit 58 stores a program 59 and an imaging information database 60. The program 59 includes, for example, a program for operating the CPU as the imaging controller 51, the microscope controller 52, the operation information management unit 53, and the imaging information management unit 54, and a program for controlling the imaging unit 18, the imaging controller 51, and the microscope controller 52 to perform time-lapse imaging of a previously designated section. The program used here runs on Microsoft Windows® as its basic software, for example, and various commands are given via the input device 56.
The microscope controller 52 controls the stage driver 41, the revolver driver 42, the illumination controller 43, the optical element controller 44, and the FS controller 45, and makes these units perform the operations necessary for the imaging. The imaging controller 51 performs various controls of the imaging unit 18 according to a previously set imaging condition. Specifically, the imaging controller 51 makes the imaging unit 18 capture an image of the sample S to generate the observation image, and stores the observation image in the imaging information database 60 inside the storage unit 58. Here, the previously set imaging condition is a condition related to exposure time, gain, and the like, and is appropriately set and changed for each sample S.
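The capture control just described can be pictured with the following short Python sketch, which applies a previously set exposure time and gain, captures one observation image, and appends it to a record store. The camera methods (set_exposure, set_gain, snap) and the list-like database are assumptions of this sketch, not the actual interfaces of the imaging unit 18 or the imaging information database 60.

```python
from dataclasses import dataclass

@dataclass
class ImagingCondition:
    exposure_s: float  # exposure time per frame, in seconds (assumed unit)
    gain: float        # sensor gain

def capture_and_store(camera, database, condition: ImagingCondition):
    # Configure the imaging unit with the previously set imaging condition,
    # capture one observation image, and store it together with the condition.
    camera.set_exposure(condition.exposure_s)
    camera.set_gain(condition.gain)
    image = camera.snap()
    database.append({"image": image,
                     "exposure_s": condition.exposure_s,
                     "gain": condition.gain})
    return image
```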
The operation information management unit 53 cooperates with the monitor 55 and the input device 56, and configures various graphical user interfaces (GUIs). The GUIs include, for example, a GUI for giving a command to the imaging unit 18 to capture an image of the sample S, a GUI for setting an area position as a target of the time-lapse imaging, and a GUI for providing information corresponding to the observation image generated by the imaging unit 18.
The microscope controller 52 performs control based on commands input from the input device 56 via the GUI displayed on the monitor 55 by the operation information management unit 53. The microscope controller 52 controls the stage driver 41 and the electromotive stage 12 to shift the imaging area in the XY and Z directions, and controls the revolver driver 42, the illumination controller 43, the optical element controller 44, and the FS controller 45 for illumination and other operations, for example.
The electromotive stage 12 has a mechanical origin for each of the X, Y, and Z directions. The microscope controller 52 internally manages the shift amounts instructed to the stage driver 41 based on the mechanical origins. Hence, the microscope controller 52 can recognize the current positional coordinates of the electromotive stage 12. In other words, the microscope controller 52 has a function of detecting the position of the electromotive stage 12 relative to the optical axis of the objective lens 13, and outputs the current positional coordinates (X, Y, Z) of the electromotive stage 12 as the current position of an imaging area. As an alternative structure, a separate position detector may be provided for detecting the current position of the electromotive stage 12, in which case the position detector directly recognizes the positional coordinates of the electromotive stage 12.
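A minimal sketch of this bookkeeping, assuming each axis has been homed to its mechanical origin once and that every commanded shift is simply accumulated, might look as follows; the class and method names are illustrative only.

```python
class StagePositionSketch:
    """Tracks the stage coordinates by accumulating commanded shifts
    relative to the mechanical origins (no separate position detector)."""

    def __init__(self):
        # After homing, each axis starts at its mechanical origin.
        self.x = self.y = self.z = 0.0

    def shift(self, dx: float = 0.0, dy: float = 0.0, dz: float = 0.0) -> None:
        # Accumulate the shift amount instructed to the stage driver.
        self.x += dx
        self.y += dy
        self.z += dz

    def current(self) -> tuple:
        # Report the current positional coordinates (X, Y, Z).
        return (self.x, self.y, self.z)
```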
A procedure of observation using the observation apparatus according to the embodiment will be described below.
First, the sample S including the living cells is placed on the electromotive stage 12. Then, the electromotive stage 12 moves the sample S so as to shift the imaging area within the XY plane relative to the imaging unit 18 until a target living cell is located, in order to select an appropriate cell as the observation target. The electromotive stage 12 shifts the imaging area within an imageable region of the sample S (a region of 10 mm × 10 mm, for example) by repeatedly moving the sample S to the left and the right while gradually shifting the sample S upwards, in a manner similar to raster scanning. When an appropriate cell is located, the imaging unit 18 captures a still image thereof.
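The serpentine sweep of the imageable region can be sketched as a generator of stage positions, for instance as below. The 10 mm × 10 mm region comes from the example above, while the 0.5 mm step (roughly one field of view) is an assumption of this sketch.

```python
def raster_positions(width_mm: float = 10.0, height_mm: float = 10.0,
                     step_mm: float = 0.5):
    # Yield (x, y) stage positions that sweep the imageable region in a
    # serpentine order: left to right, one row up, then right to left,
    # mirroring the raster-scan-like screening motion described above.
    cols = int(round(width_mm / step_mm)) + 1
    rows = int(round(height_mm / step_mm)) + 1
    for r in range(rows):
        col_order = range(cols) if r % 2 == 0 else reversed(range(cols))
        for c in col_order:
            yield (c * step_mm, r * step_mm)
```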
At the image capturing, the observation apparatus receives area designating information from the input device 56, which designates an imaging area covering the appropriate cell. Every time the area designating information is supplied from the input device 56, the microscope controller 52 moves the sample S until the imaging area designated by the area designating information comes into the imaging region of the imaging unit 18, and temporarily stops the sample S at that position. Whenever the sample S is temporarily stopped, the imaging controller 51 makes the imaging unit 18 capture the image of the sample S to generate an observation image, and stores the observation image in the imaging information database 60.
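One way to picture this screening step in code, assuming hypothetical stage.move_to and camera.snap calls and a list-like database, is the following sketch; it would be run once for each piece of area designating information received from the input device 56.

```python
def screen_designated_area(stage, camera, database, area_xy):
    # area_xy: (x, y) coordinate of the imaging area designated via the GUI.
    stage.move_to(*area_xy)   # bring the designated imaging area into view
    image = camera.snap()     # capture a still image while the stage is stopped
    database.append({"area_xy": area_xy, "image": image})  # keep for screening
```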
When the imaging areas a to f, each including a desirable observation target, have been extracted from the imageable region R and stored in the above-described manner, the screening (cell locating) operation finishes.
Thereafter, an imaging area including a particularly suitable cell is selected from the extracted imaging areas a to f. Generally, it is desirable to use an isolated cell for the observation of the living cell. Therefore, the imaging areas a, c, and e, for example, are selected as the observation targets of the time-lapse imaging. Though an image of the imaging area f also includes an isolated cell, the imaging area f is not selected as the observation target of the time-lapse imaging. The imaging information database 60 stores the XY coordinates of the electromotive stage 12 as indications of the area positions of the imaging areas a, c, and e, respectively, as described above. When the imaging areas a, c, and e are selected as observation targets for the time-lapse imaging, the XY coordinates corresponding to the imaging areas a, c, and e are stored as time-lapse imaging positions that indicate positions of observation targets for the time-lapse imaging.
Every time a previously set time interval for the time-lapse imaging passes, the microscope controller 52 drives the electromotive stage 12 via the stage driver 41 to sequentially place the imaging areas a, c, and e in the imaging region of the imaging unit 18, based on the XY coordinates of the electromotive stage 12 corresponding to the area positions of the imaging areas a, c, and e as stored in the imaging information database 60. Every time one of the imaging areas a, c, and e is placed within the imaging region, the imaging controller 51 gives an imaging command to the imaging unit 18. In response to the imaging command, the imaging unit 18 sequentially captures images of the imaging areas a, c, and e via the objective lens 13 to generate observation images thereof, and the generated observation images are stored in the imaging information database 60. Further, the microscope controller 52 stores the illumination condition of one of the transmitting illumination optical system and the incident-light illumination optical system, and the observation magnification of the observation optical system, in the imaging information database 60. In the imaging information database 60, the storage unit 58 associates the XY coordinates indicating the area position of the imaging area, the illumination condition, and the observation magnification with each of the observation images obtained by the time-lapse imaging.
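The multipoint time-lapse cycle described above can be summarized, purely as an illustrative sketch, by the following Python loop. The stage, camera, and scope objects and their methods (move_to, snap, illumination_condition, magnification) are assumptions of this sketch; the point is only that each stored record associates the area position, the observation image, the illumination condition, and the observation magnification.

```python
import time

def multipoint_time_lapse(stage, camera, scope, database,
                          positions, interval_s, rounds):
    # positions: stored XY coordinates of imaging areas a, c, and e.
    for _ in range(rounds):
        start = time.time()
        for xy in positions:
            stage.move_to(*xy)     # place the imaging area in the imaging region
            image = camera.snap()  # generate the observation image
            database.append({
                "area_xy": xy,                                   # area position
                "image": image,
                "illumination": scope.illumination_condition(),  # see next paragraph
                "magnification": scope.magnification(),
                "captured_at": time.time(),
            })
        # Wait out the remainder of the previously set time interval.
        time.sleep(max(0.0, interval_s - (time.time() - start)))
```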
The illumination condition stored in the imaging information database 60 includes, for example: the elapsed time since the microscope controller 52 started illuminating the sample S using one of the transmitting illumination optical system and the incident-light illumination optical system; the irradiation time during which the transmitting illumination optical system or the incident-light illumination optical system illuminates the sample S every time the imaging unit 18 captures the image of the sample S; the irradiation intensity of the illumination light irradiated on the sample S by the transmitting illumination optical system or the incident-light illumination optical system during the irradiation time; and the wavelength of the illumination light. More specifically, the elapsed time corresponds to the time that has passed from the start of the screening operation until the microscope controller 52 turns on one of the incident-light illumination light source 22 and the transmitting illumination light source 31, and the irradiation time corresponds to the time the incident-light illumination light source 22 or the transmitting illumination light source 31 remains on at each image-taking by the imaging unit 18.
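The fields enumerated above could, for example, be collected into a small record such as the following; the field names and units are assumptions made for this sketch only.

```python
from dataclasses import dataclass

@dataclass
class IlluminationCondition:
    elapsed_s: float      # elapsed time since illumination of the sample started
    irradiation_s: float  # time the light source stays on for this capture
    intensity: float      # irradiation intensity during the irradiation time
    wavelength_nm: float  # wavelength of the illumination light
```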
The time-lapse imaging is performed at a high observation magnification (40×) every hour starting from time 1:00, for example. Imaging at a low magnification (10×) is also performed once every four hours to check the influence on the surrounding cells.
After the time-lapse imaging is finished, the imaging information management unit 54 calculates the accumulated amount of illumination light for each observation image based on the illumination conditions stored in the imaging information database 60. The imaging information management unit 54 can display information indicating the accumulated amount of illumination light superposed on the observation image on the monitor 55. Specifically, the imaging information management unit 54 divides an image area corresponding to the imageable region R into two-dimensional blocks and displays the image area as a coordinate table, on which the observation images corresponding to the imaging areas a, c, and e are displayed. Further, the imaging information management unit 54 can convert the accumulated amount of illumination light irradiated on each imaging area corresponding to an observation image into a display brightness, i.e., a brightness of the image, and can display the observation image on the monitor 55 at the obtained brightness.
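As an illustration of such a calculation, the sketch below sums irradiation time multiplied by irradiation intensity over every capture of one imaging area (reusing the IlluminationCondition fields sketched above) and maps the result linearly onto a display brightness. Both the time-times-intensity dose model and the linear brightness mapping are assumptions of this sketch; the embodiment does not specify the exact conversion.

```python
def accumulated_dose(conditions):
    # Accumulated amount of illumination light for one imaging area:
    # sum of irradiation time x intensity over all of its captures.
    return sum(c.irradiation_s * c.intensity for c in conditions)

def dose_to_brightness(dose, max_dose, levels=256):
    # Map an accumulated dose onto a display brightness so that more
    # strongly irradiated imaging areas are shown brighter.
    if max_dose <= 0:
        return 0
    return min(levels - 1, int(levels * dose / max_dose))
```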
The operation information management unit 53 displays the GUI on the monitor 55. An operator performs a predetermined click manipulation with the mouse (for example, double-clicking the mouse button) on a specific block or on a specific observation image, thereby inputting designating information that designates an area position. On receiving the designating information that designates the area position from the input device 56, the imaging information management unit 54 can display the plural illumination conditions stored in the imaging information database 60 in temporal order in association with the designated area position, as shown in the accompanying drawings.
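Retrieving the stored conditions in temporal order for a designated area position might look like the following sketch, assuming the list-of-records database and the captured_at timestamp used in the earlier time-lapse sketch.

```python
def conditions_for_area(database, area_xy):
    # Collect every record stored for the designated area position and
    # return its illumination conditions ordered by capture time.
    records = [r for r in database if r["area_xy"] == area_xy]
    records.sort(key=lambda r: r["captured_at"])
    return [r["illumination"] for r in records]
```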
Further, when the operator similarly performs a predetermined click manipulation with the mouse (for example, selects a menu item by right-clicking) on a specific block or a specific observation image to input, from the input device 56, designating information that designates an area position, the imaging information management unit 54 can calculate temporal changes in the accumulated amount of illumination light on the imaging area corresponding to the designated area position, based on the plural illumination conditions stored in temporal order in the imaging information database 60 in association with the designated area position, and can display information indicating the temporal changes on the monitor 55, as shown in the accompanying drawings.
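Under the same assumptions, the temporal change of the accumulated amount of illumination light for the designated area can be sketched as a running total, one entry per capture, which is the kind of series that would be displayed on the monitor 55.

```python
def cumulative_dose_over_time(conditions):
    # Running total of irradiation time x intensity after each capture,
    # showing how the accumulated amount of illumination light grows.
    totals, running = [], 0.0
    for c in conditions:
        running += c.irradiation_s * c.intensity
        totals.append(running)
    return totals
```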
Further, when the operator performs a predetermined click manipulation with the mouse (for example, selects a menu item by right-clicking) on the GUI displayed on the monitor 55 by the operation information management unit 53, the imaging information management unit 54 can, in response thereto, display the observation images corresponding to the imaging areas b, d, and f, for which the accumulated amount of illumination light is small, on the monitor 55 in addition to the time-lapse images corresponding to the imaging areas a, c, and e, as shown in the accompanying drawings.
Further, when the operator performs a predetermined click manipulation with the mouse (for example, double-clicking the mouse button) on the observation images displayed on the monitor 55 to input, from the input device 56, image selecting information that selects plural observation images, the imaging information management unit 54 can perform display processing corresponding to the selected observation images, as shown in the accompanying drawings.
Further, when the operator performs a predetermined click manipulation with the mouse (for example, double-clicking the mouse button) on a specific block or a specific observation image displayed on the monitor 55 to input, from the input device 56, designating information that designates an area position, the imaging information management unit 54 can perform display processing corresponding to the designated area position, as shown in the accompanying drawings.
As can be seen from the foregoing, the observation apparatus according to the embodiment can display, in addition to the observation images obtained by the time-lapse imaging, various types of information stored in association with those images, such as the illumination condition. In brief, the observation apparatus of the embodiment can display, in association with each observation image, various types of information useful for evaluating that image, for example, the illumination condition.
Additional advantages and modifications will readily occur to those skilled in the art. Therefore, the invention in its broader aspects is not limited to the specific details and representative embodiments shown and described herein. Accordingly, various modifications may be made without departing from the spirit or scope of the general inventive concept as defined by the appended claims and their equivalents.
Number | Date | Country | Kind
--- | --- | --- | ---
2005-274331 | Sep 2005 | JP | national
2006-208875 | Jul 2006 | JP | national