1. Field of the Invention
The present invention relates to an image obtaining method and an image capturing apparatus for obtaining a deep portion image by directing light having two different wavelengths to obtain two types of images and performing subtraction between the two images.
2. Description of the Related Art
Endoscope systems for observing tissues in body cavities are widely known, and an electronic endoscope system that captures an ordinary image of an observation area in a body cavity by directing white light to the observation area and displays the captured ordinary image on a monitor screen is in wide use.
Further, as one of such endoscope systems, a system that obtains a fluorescence image of a blood vessel or a lymphatic vessel by administering, for example, indocyanine green (ICG) into a body in advance and directing excitation light to the observation area to detect ICG fluorescence in the blood vessel or lymphatic vessel is known, as described, for example, in U.S. Pat. No. 6,804,549 and Japanese Unexamined Patent Publication No. 2007-244746.
Further, U.S. Pat. No. 7,589,839 proposes a method of obtaining a plurality of fluorescence images using a plurality of fluorescent materials.
For example, the blood vessel observation using the ICG described above allows observation of a blood vessel located in a deep layer in the fluorescence image, since the near infrared light used as the excitation light has high penetration into a living body. The fluorescence image, however, includes not only the fluorescence image of the blood vessel in the deep layer but also a fluorescence image of a blood vessel in a surface layer, so that the image information of the blood vessel in the surface layer is unnecessary information (an artifact) when only the image of the blood vessel in the deep layer is to be observed.
The present invention has been developed in view of the circumstances described above, and it is an object of the present invention to provide an image obtaining method and image capturing apparatus capable of obtaining, for example, a deep portion image that allows only an image of a blood vessel located in a deep layer to be observed appropriately.
An image obtaining method of the present invention is a method including the steps of:
An image capturing apparatus of the present invention is an apparatus including:
In the image capturing apparatus of the present invention described above, near infrared light may be used as the first excitation light.
Further, the light emission unit may be a unit that emits the first excitation light and the second excitation light at the same time, and the imaging unit may be a unit that captures the first fluorescence image and the second fluorescence image at the same time.
An image capturing apparatus of the present invention is an apparatus including:
In the image capturing apparatus of the present invention described above, near infrared light may be used as the excitation light.
Further, the light emission unit may be a unit that emits the excitation light and the narrowband light at the same time, and the imaging unit may be a unit that captures the fluorescence image and the narrowband image at the same time.
According to the image obtaining method and image capturing apparatus of the present invention, a first image captured by directing light having a first wavelength to an observation area and receiving light emitted from the observation area, and a second image captured by directing light having a second wavelength shorter than the first wavelength to the observation area and receiving light emitted from the observation area are obtained, and a deep portion image of the observation area is obtained by subtracting the second image from the first image. This allows, for example, subtraction of a second image that includes a blood vessel located only in a surface layer from a first image that includes blood vessels located in the surface layer and a deep layer, whereby a deep portion image that includes a blood vessel located in the deep layer may be obtained.
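The subtraction principle described above can be sketched numerically as follows (an illustrative Python/NumPy example; the array sizes and intensity values are invented for illustration and do not represent actual sensor data):

```python
import numpy as np

# Hypothetical 5x5 intensity images.
# first_image: captured with the longer-wavelength light; shows surface AND deep vessels.
# second_image: captured with the shorter-wavelength light; shows surface vessels only.
first_image = np.zeros((5, 5))
first_image[1, :] = 200   # surface-layer vessel (appears in both images)
first_image[3, :] = 120   # deep-layer vessel (appears only in the first image)

second_image = np.zeros((5, 5))
second_image[1, :] = 200  # the same surface-layer vessel

# Subtracting the second image from the first removes the surface vessel,
# leaving only the deep portion image (negative values are clipped to zero).
deep_portion = np.clip(first_image - second_image, 0, None)

assert deep_portion[1].max() == 0    # surface vessel cancelled
assert deep_portion[3].max() == 120  # deep vessel retained
```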
Hereinafter, a rigid endoscope system that employs a first embodiment of the image obtaining method and image capturing apparatus of the present invention will be described with reference to the accompanying drawings.
As shown in
As shown in
Body cavity insertion section 30 and imaging unit 20 are detachably connected, as shown in
Connection member 30a is provided at first end 30X of body cavity insertion section 30 (insertion member 30b), and imaging unit 20 and body cavity insertion section 30 are detachably connected by fitting connection member 30a into, for example, aperture 20a formed in imaging unit 20.
Insertion member 30b is a member to be inserted into a body cavity when imaging is performed in the body cavity. Insertion member 30b is formed of a rigid material and has, for example, a cylindrical shape with a diameter of about 5 mm. Insertion member 30b accommodates inside thereof a group of lenses for forming an image of an observation area, and an ordinary image and a fluorescence image of the observation area inputted from second end 30Y are inputted, through the group of lenses, to imaging unit 20 on the side of first end 30X.
Cable connection port 30c is provided on the side surface of insertion member 30b and an optical cable LC is mechanically connected to the port. This causes light source unit 2 and insertion member 30b to be optically connected through the optical cable LC.
As shown in
Further, blue light output lens 30f for outputting blue light and near infrared light output lens 30e for outputting near infrared light are provided symmetrically with respect to imaging lens 30d at second end 30Y of body cavity insertion section 30.
Tubular sleeve member 53 is provided so as to cover the periphery of fluorescent body 52, and ferrule 54 for holding multimode optical fiber 51 as the central axis is inserted in sleeve member 53. Further, flexible sleeve 55 is inserted between sleeve member 53 and multimode optical fiber 51 extending from the proximal side (opposite to the distal side) of ferrule 54 to cover the jacket of the fiber.
Blue light projection unit 60 includes multimode optical fiber 61 for guiding the blue light, and space 62 is provided between multimode optical fiber 61 and blue light output lens 30f. Blue light projection unit 60 is also provided with tubular sleeve member 63 covering the periphery of space 62, in addition to ferrule 64 and flexible sleeve 65, as in white light projection unit 50.
Inside body cavity insertion section 30, two white light projection units 50 are provided symmetrically with respect to imaging lens 30d, and blue light projection unit 60 and the near infrared light projection unit are provided symmetrically with respect to imaging lens 30d. The near infrared light projection unit has a structure identical to that of blue light projection unit 60 except that the near infrared light is guided through its multimode optical fiber. Note that the dotted circle in each output lens in
As for the multimode optical fiber used in each light projection unit, for example, a thin optical fiber with a core diameter of 105 μm, a clad diameter of 125 μm, and an overall diameter, including a protective outer jacket, of 0.3 mm to 0.5 mm may be used.
The spectrum of light outputted from each light projection unit and the spectra of fluorescence and reflection light emitted or reflected from an observation area irradiated with that light are shown in
The term “white light” as used herein is not strictly limited to light having all wavelength components of visible light and may include any light as long as it includes light in a specific wavelength range, for example, primary light of R (red), G (green), or B (blue). Thus, in a broad sense, the white light may include, for example, light having wavelength components from green to red, light having wavelength components from blue to green, and the like. Although white light projection unit 50 emits the blue light spectrum S1 and visible light spectrum S2 shown in
The first imaging system includes dichroic prism 21 that reflects the ICG fluorescence image emitted from the observation area in a right angle direction, excitation light cut filter 22 that transmits the ICG fluorescence image reflected by dichroic prism 21 and cuts the near infrared excitation light reflected by dichroic prism 21, first image forming system 23 that forms the ICG fluorescence image transmitted through excitation light cut filter 22, and first high sensitivity image sensor 24 that takes the ICG fluorescence image formed by first image forming system 23.
The second imaging system includes dichroic prism 21 that transmits the fluorescein fluorescence image emitted from the observation area, second image forming system 25 that forms the fluorescein fluorescence image transmitted through dichroic prism 21, color separation prism 26 that transmits the fluorescein fluorescence image formed by second image forming system 25, and second high sensitivity image sensor 28 that takes the fluorescein fluorescence image transmitted through color separation prism 26.
The third imaging system includes dichroic prism 21 that transmits an ordinary image based on reflection light (visible light) reflected from the observation area irradiated with the white light, second image forming system 25 that forms the ordinary image transmitted through dichroic prism 21, color separation prism 26 that separates the ordinary image formed by second image forming system 25 into R (red), G (green), and B (blue) wavelength ranges, third high sensitivity image sensor 27 that images the red light separated by color separation prism 26, second high sensitivity image sensor 28 that images the green light separated by color separation prism 26, and fourth high sensitivity image sensor 29 that images the blue light separated by color separation prism 26.
Color separation prism 26 doubles as an excitation light cut filter since it separates the blue excitation light on the side of fourth high sensitivity image sensor 29 when the fluorescein fluorescence image is captured.
Now, referring to
Imaging unit 20 further includes imaging control unit 20b. Imaging control unit 20b is a unit that performs CDS/AGC (correlated double sampling/automatic gain control) and A/D conversion on image signals outputted from high sensitivity image sensors 24 and 27 to 29, and outputs the resultant image signals to image processing unit 3 through cable 5 (
As shown in
Ordinary image input controller 31 and fluorescence image input controller 32 are each provided with a line buffer having a predetermined capacity and temporarily store, with respect to one frame, an ordinary image signal formed of image signals of RGB components, or an ICG fluorescence image signal and a fluorescein fluorescence image signal outputted from imaging control unit 20b of imaging unit 20. Then, the ordinary image signal stored in ordinary image input controller 31 and the fluorescence image signals stored in fluorescence image input controller 32 are stored in memory 34 via the bus.
Image processing section 33 receives the ordinary image signal and fluorescence image signal for one frame read out from memory 34, performs predetermined processing on these image signals, and outputs the resultant image signals to the bus.
As shown in
Video output section 35 receives the ordinary image signal, fluorescence image signal, and composite image signal outputted from image processing section 33 via the bus, generates a display control signal by performing predetermined processing on the received signals, and outputs the display control signal to monitor 4.
Operation section 36 receives input from the operator, such as various types of operation instructions and control parameters. TG 37 outputs drive pulse signals for driving high sensitivity image sensors 24 and 27 to 29 of imaging unit 20, and LD drivers 45, 48 of light source unit 2, to be described later. CPU 36 performs overall control of the system.
As shown in
Light source unit 2 further includes near infrared LD light source 46 that emits 750 to 790 nm near infrared light, condenser lens 47 that condenses the near infrared light and inputs the condensed near infrared light to the input end of optical cable LC4, and LD driver 48 that drives near infrared LD light source 46.
In the present embodiment, near infrared light and blue light are used as the two types of excitation light, but excitation light having other wavelengths may also be used, as long as the wavelength of one of them is shorter than that of the other and the excitation light is determined appropriately according to the type of fluorochrome administered to the observation area or the type of living tissue that causes autofluorescence.
Light source unit 2 is optically coupled to rigid endoscope device 10 through optical cable LC, in which optical cables LC1, LC2 are optically coupled to multimode optical fibers 51 of white light projection units 50, optical cable LC3 is optically coupled to multimode optical fiber 61 of blue light projection unit 60, and optical cable LC4 is optically coupled to the multimode optical fiber of the near infrared light projection unit.
An operation of the rigid endoscope system of the first embodiment will now be described.
Before going into detailed description of the system operation, the principle of detection of a deep portion blood vessel image to be obtained in the present embodiment will be described using a schematic drawing. In the present embodiment, an image of a deep portion blood vessel located in a deep layer 1 to 3 mm deep from the body surface is obtained, as shown in
Consequently, in the rigid endoscope system of the present embodiment, a deep portion blood vessel image is obtained by subtracting the fluorescein fluorescence image from the ICG fluorescence image, as illustrated in
Now, a specific operation of the rigid endoscope system of the present invention will be described.
First, body cavity insertion section 30 with the optical cable LC attached thereto and cable 5 are connected to imaging unit 20 and power is applied to light source unit 2, imaging unit 20, and image processing unit 3 to activate them.
Then, body cavity insertion section 30 is inserted into a body cavity by the operator and the tip of body cavity insertion section 30 is placed adjacent to an observation area. Here, it is assumed that ICG and fluorescein have already been administered to the observation area.
Here, an operation of the system for capturing an ICG fluorescence image and an ordinary image will be described first. When capturing an ICG fluorescence image and an ordinary image, blue light emitted from blue LD light source 40 of light source unit 2 is inputted, among optical cables LC1 to LC4, only to optical cables LC1 and LC2 through condenser lens 41, optical fiber switch 42, and optical fiber splitter 43. Then, the blue light is guided through optical cables LC1 and LC2 and inputted to body cavity insertion section 30, and further guided through multimode optical fibers 51 of white light projection units 50 in body cavity insertion section 30. Thereafter, a portion of the blue light outputted from the output end of each multimode optical fiber 51 is transmitted through fluorescent body 52 and directed to the observation area, while the remaining blue light is subjected to wavelength conversion to green to yellow visible light by fluorescent body 52 and directed to the observation area. That is, the observation area is irradiated with white light formed of the blue light and the green to yellow visible light.
In the meantime, near infrared light emitted from near infrared LD light source 46 of light source unit 2 is inputted to body cavity insertion section 30 through condenser lens 47 and optical cable LC4. Then, the near infrared light is guided through the multimode optical fiber of the near infrared light projection unit in body cavity insertion section 30 and directed to the observation area simultaneously with the white light.
Then, an ordinary image based on reflection light reflected from the observation area irradiated with the white light and an ICG fluorescence image based on ICG fluorescence emitted from the observation area irradiated with the near infrared light are captured simultaneously.
More specifically, an ordinary image is captured in the following manner. Reflection light reflected from the observation area irradiated with the white light is inputted to insertion member 30b from imaging lens 30d at the tip 30Y of insertion member 30b, then guided by the group of lenses inside of the insertion member 30b, and outputted to imaging unit 20.
The reflection light inputted to imaging unit 20 is transmitted through dichroic prism 21 and second image forming system 25, then separated into R, G, and B wavelength ranges by color separation prism 26, and the red light is imaged by third high sensitivity image sensor 27, the green light is imaged by second high sensitivity image sensor 28, and the blue light is imaged by fourth high sensitivity image sensor 29.
Then, the R, G, and B image signals outputted from high sensitivity image sensors 27 to 29, respectively, are subjected to CDS/AGC (correlated double sampling/automatic gain control) and A/D conversion in imaging control unit 20b, and outputted to image processing unit 3 through cable 5.
In the meantime, the ICG fluorescence image is captured in the following manner. The ICG fluorescence image emitted from the observation area irradiated with the near infrared excitation light is inputted to insertion member 30b from imaging lens 30d at the tip 30Y of insertion member 30b, then guided by the group of lenses inside of the insertion member 30b, and outputted to imaging unit 20.
The ICG fluorescence image inputted to imaging unit 20 is reflected in a right angle direction by dichroic prism 21, then passed through excitation light cut filter 22, formed on the imaging surface of first high sensitivity image sensor 24 by first image forming system 23, and imaged by first high sensitivity image sensor 24. The ICG fluorescence image signal outputted from first high sensitivity image sensor 24 is subjected to CDS/AGC (correlated double sampling/automatic gain control) and A/D conversion in imaging control unit 20b, and outputted to image processing unit 3 through cable 5.
Next, an operation of the system for capturing a fluorescein fluorescence image will be described.
A fluorescein fluorescence image is captured in the following manner. When capturing a fluorescein fluorescence image, blue light emitted from blue LD light source 40 of light source unit 2 is inputted, among optical cables LC1 to LC4, only to optical cable LC3 through condenser lens 41 and optical fiber switch 42. Then, the blue light is guided through optical cable LC3 and inputted to body cavity insertion section 30, and further guided through multimode optical fiber 61 of blue light projection unit 60 in body cavity insertion section 30. Thereafter, the blue light outputted from the output end of multimode optical fiber 61 is passed through space 62 and directed to the observation area.
Then, a fluorescein fluorescence image emitted from the observation area irradiated with the blue light is inputted to insertion member 30b from imaging lens 30d at the tip 30Y of insertion member 30b, then guided by the group of lenses inside of the insertion member 30b, and outputted to imaging unit 20.
The fluorescein fluorescence image inputted to imaging unit 20 is transmitted through dichroic prism 21, second image forming system 25, and color separation prism 26, and imaged by second high sensitivity image sensor 28.
The fluorescein fluorescence image signal outputted from second high sensitivity image sensor 28 is subjected to CDS/AGC (correlated double sampling/automatic gain control) and A/D conversion in imaging control unit 20b, and outputted to image processing unit 3 through cable 5.
Now, referring to A to E of
A of
In the timing charts of R, G, and B image signals shown in A to C of
As the ordinary image and fluorescein fluorescence image share the same G color component and cannot be imaged at the same time, they are imaged at different timings as shown in A to C and D of
Note that blue LD light source 40 and near infrared LD light source 46 in light source unit 2 are drive-controlled according to the timing charts of A to E of
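The interleaved capture described by the timing charts can be sketched as follows (a hypothetical scheduler in Python; the alternating even/odd frame assignment and the function name are assumptions for illustration, since the exact duty cycle is defined by the timing charts of the figures):

```python
def frame_plan(n_frames):
    """Hypothetical alternating capture schedule.

    Even frames: ordinary image and ICG fluorescence image are captured
    simultaneously (white light and near infrared light on).
    Odd frames: fluorescein fluorescence image is captured (blue light only),
    since it shares the green-channel sensor with the ordinary image.
    """
    return [('ordinary', 'ICG') if f % 2 == 0 else ('fluorescein',)
            for f in range(n_frames)]
```

For example, `frame_plan(4)` yields a schedule in which frames 0 and 2 capture the ordinary and ICG fluorescence images together, while frames 1 and 3 capture the fluorescein fluorescence image.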
Next, an operation of the system for displaying an ordinary image, a fluorescence image, and a composite image based on the ordinary image signal formed of R, G, and B image signals, ICG fluorescence image signal, and fluorescein fluorescence image signal obtained by imaging unit 20 will be described with reference to
An operation for displaying the ordinary image and ICG fluorescence image will be described first. The ordinary image signal formed of R, G, and B image signals inputted to image processing unit 3 is temporarily stored in ordinary image input controller 31 and then stored in memory 34 (
Video output section 35 generates a display control signal by performing predetermined processing on the inputted ordinary image signal and outputs the display control signal to monitor 4. Monitor 4, in turn, displays an ordinary image based on the inputted display control signal (
The ICG fluorescence image signal inputted to image processing unit 3 is temporarily stored in fluorescence image input controller 32 and then stored in memory 34 (
Video output section 35 generates a display control signal by performing predetermined processing on the inputted ICG fluorescence image signal and outputs the display control signal to monitor 4. Monitor 4, in turn, displays an ICG fluorescence image based on the inputted display control signal (
Next, an operation of the system for generating a deep portion blood vessel image based on the ICG fluorescence image signal and fluorescein fluorescence image signal, and displaying a composite image combining the deep portion blood vessel image and ordinary image will be described.
The fluorescein fluorescence image signal inputted to image processing unit 3 is temporarily stored in fluorescence image input controller 32 and then stored in memory 34 (
Then, the fluorescein fluorescence image signal and ICG fluorescence image signal stored in memory 34 are inputted to blood vessel extraction section 33c of image processing section 33. Then, in blood vessel extraction section 33c, blood vessel extraction processing is performed on each image signal (
The blood vessel extraction may be implemented by performing line segment extraction. In the present embodiment, the line segment extraction is implemented by performing edge detection and removing isolated points from the edges detected by the edge detection. Edge detection methods include, for example, the Canny method, which uses first derivatives. A flowchart explaining the line segment extraction using Canny edge detection is shown in
As shown in
Thereafter, with respect to each of ICG fluorescence image signal and fluorescein fluorescence image signal subjected to the filtering, the magnitude and direction of the density gradient are calculated (
Then, the local maximum point is compared to a predetermined threshold value and a local maximum point with a value greater than or equal to the threshold value is detected as an edge (
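The line segment extraction steps above (Gaussian filtering, density-gradient calculation, local-maximum detection, and thresholding), together with the isolated-point removal, can be sketched as follows (a simplified illustrative implementation in Python using NumPy; the function name, the threshold value, and the quantization of the gradient direction to horizontal/vertical are simplifying assumptions, not the actual implementation):

```python
import numpy as np

def extract_line_segments(img, sigma=1.0, thresh=50.0):
    """Simplified Canny-style line segment extraction:
    Gaussian smoothing, density-gradient magnitude, non-maximum
    suppression along the gradient, thresholding, then
    isolated-point removal. Illustrative only."""
    # Separable Gaussian smoothing for noise reduction
    r = int(3 * sigma)
    x = np.arange(-r, r + 1)
    g = np.exp(-x**2 / (2 * sigma**2))
    g /= g.sum()
    sm = np.apply_along_axis(lambda row: np.convolve(row, g, mode='same'), 1, img)
    sm = np.apply_along_axis(lambda col: np.convolve(col, g, mode='same'), 0, sm)

    # Magnitude and direction of the density gradient
    gy, gx = np.gradient(sm)
    mag = np.hypot(gx, gy)

    # Keep local maxima along the gradient direction (quantized to
    # horizontal/vertical for brevity) that exceed the threshold
    edges = np.zeros_like(mag, dtype=bool)
    for i in range(1, mag.shape[0] - 1):
        for j in range(1, mag.shape[1] - 1):
            if abs(gx[i, j]) >= abs(gy[i, j]):
                nbrs = (mag[i, j - 1], mag[i, j + 1])
            else:
                nbrs = (mag[i - 1, j], mag[i + 1, j])
            if mag[i, j] >= max(nbrs) and mag[i, j] >= thresh:
                edges[i, j] = True

    # Remove isolated points: an edge pixel with no edge neighbour is noise
    out = edges.copy()
    for i in range(1, edges.shape[0] - 1):
        for j in range(1, edges.shape[1] - 1):
            if edges[i, j] and edges[i - 1:i + 2, j - 1:j + 2].sum() <= 1:
                out[i, j] = False
    return out
```

Applied to an image containing a bright vertical stripe, this sketch marks edge pixels along both sides of the stripe while suppressing the non-maximal and isolated responses.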
The edge detection algorithm is not limited to that described above, and the edge detection may also be performed using a LoG (Laplacian of Gaussian) filter that combines Gaussian filtering for noise reduction with a Laplacian filter for edge extraction through second-order differentiation.
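As an illustration of the LoG alternative, a discrete Laplacian-of-Gaussian kernel may be constructed as follows (a minimal sketch in Python using NumPy; the kernel radius and zero-sum normalization are illustrative assumptions, and edges would then be located at the zero crossings of the filtered image):

```python
import numpy as np

def log_kernel(sigma=1.0, radius=4):
    """Discrete Laplacian-of-Gaussian kernel: Gaussian noise reduction
    combined with second-derivative edge extraction in a single filter."""
    y, x = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    r2 = x**2 + y**2
    # Laplacian of a Gaussian, up to a positive constant factor
    k = (r2 - 2 * sigma**2) / sigma**4 * np.exp(-r2 / (2 * sigma**2))
    return k - k.mean()   # zero-sum, so flat regions give zero response
```

The kernel has a negative center surrounded by a positive ring, so convolving with it responds strongly where the intensity changes curvature, which is where edges produce zero crossings.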
In the present embodiment, a blood vessel is extracted by line segment extraction using edge detection, but the method of blood vessel extraction is not limited to this and any method may be employed as long as it is designed for extracting a blood vessel portion, such as a method using hue or luminance.
With respect to each of the ICG fluorescence image signal and fluorescein fluorescence image signal, an ICG fluorescence blood vessel image signal and a fluorescein fluorescence blood vessel image signal are generated by extracting a blood vessel in the manner as described above. The fluorescein fluorescence blood vessel image signal represents an image of a surface layer blood vessel located in a surface layer from the body surface of the observation area to a depth of 1 mm, while the ICG fluorescence blood vessel image signal includes both the surface layer blood vessel and a deep portion blood vessel located in a deep layer of a depth of 1 to 3 mm from the body surface.
Then, the ICG fluorescence blood vessel image signal and fluorescein fluorescence blood vessel image signal generated in blood vessel extraction section 33c are outputted to image calculation section 33d where a deep portion blood vessel image is generated based on these signals. More specifically, the deep portion blood vessel image is generated by subtracting the fluorescein fluorescence blood vessel image signal from the ICG fluorescence blood vessel image signal (
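The subtraction performed in image calculation section 33d can be sketched for binary vessel maps as follows (an illustrative Python/NumPy example; representing the extracted blood vessel image signals as boolean masks is a simplifying assumption):

```python
import numpy as np

# Hypothetical binary vessel maps produced by the line segment extraction.
icg_vessels = np.zeros((4, 4), dtype=bool)
icg_vessels[0, :] = True   # surface-layer vessel (seen via ICG fluorescence)
icg_vessels[2, :] = True   # deep portion vessel (seen only via ICG fluorescence)

fluorescein_vessels = np.zeros((4, 4), dtype=bool)
fluorescein_vessels[0, :] = True  # the same surface-layer vessel only

# "Subtracting" the surface map from the combined map leaves the deep vessels.
deep_vessels = icg_vessels & ~fluorescein_vessels

assert not deep_vessels[0].any()  # surface vessel removed
assert deep_vessels[2].all()      # deep portion vessel retained
```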
The deep portion blood vessel image generated in image calculation section 33d in the manner as described above is outputted to image combining section 33e. Image combining section 33e also receives the ordinary image signal outputted from ordinary image processing section 33a, and combines the ordinary image signal and deep portion blood vessel image signal to generate a composite image signal (
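The combining performed in image combining section 33e can be sketched as follows (an illustrative Python/NumPy example; painting vessel pixels in a fixed highlight color is an assumed blending scheme, as the actual combining method is not specified here):

```python
import numpy as np

def combine(ordinary_rgb, deep_vessels, color=(0, 255, 0)):
    """Overlay the deep portion blood vessel image on the ordinary image
    by painting vessel pixels in a highlight color (assumed scheme)."""
    out = ordinary_rgb.copy()
    out[deep_vessels] = color  # boolean mask selects pixels across all channels
    return out
```

For example, combining a uniform gray ordinary image with a vessel mask marks the vessel pixels green while leaving the rest of the ordinary image unchanged.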
The composite image signal generated in image combining section 33e is outputted to video output section 35. Video output section 35 generates a display control signal by performing predetermined processing on the inputted composite image signal, and outputs the display control signal to monitor 4. Monitor 4 displays a composite image based on the inputted display control signal (
Next, a rigid endoscope system that employs a second embodiment of the image obtaining method and image capturing apparatus of the present invention will be described in detail. The rigid endoscope system of the second embodiment obtains a narrowband image using green narrowband light instead of the fluorescein fluorescence image obtained in the rigid endoscope system of the first embodiment.
The overall configuration of the rigid endoscope system of the second embodiment is identical to that of the rigid endoscope system of the first embodiment shown in
Referring to
Referring to
Light source 6 is optically coupled to rigid endoscope device 10 through optical cable LC, in which optical cables LC1, LC2 are optically coupled to multimode optical fibers 51 of white light projection unit 50, optical cable LC3 is optically coupled to the multimode optical fiber of the green light projection unit, and optical cable LC4 is optically coupled to the multimode optical fiber of the near infrared light projection unit.
The spectrum of light outputted from each light projection unit provided inside body cavity insertion section 30 of the present embodiment and the spectra of fluorescence and reflection light emitted or reflected from an observation area irradiated with that light are shown in
The green light outputted from the green light projection unit has a wavelength of 530 nm to 550 nm which is shorter than that of the near infrared light and is narrowband light with a bandwidth of 20 nm which is narrower than that of the white light. In the present embodiment, the green light is used, but light in other wavelength ranges may be used as long as it has a shorter wavelength than that of the near infrared excitation light and a narrower bandwidth than that of the white light.
Now referring to
The first imaging system includes dichroic prism 81 that transmits the ICG fluorescence image emitted from the observation area, excitation light cut filter 82 that transmits the ICG fluorescence image transmitted through dichroic prism 81 and cuts the near infrared excitation light transmitted through dichroic prism 81, first image forming system 83 that forms the ICG fluorescence image transmitted through excitation light cut filter 82, and first high sensitivity image sensor 84 that takes the ICG fluorescence image formed by first image forming system 83.
The second imaging system includes dichroic prism 81 that reflects the ordinary image and green narrowband image reflected from the observation area in a right angle direction, second image forming system 85 that forms the ordinary image and green narrowband image reflected by dichroic prism 81, and second high sensitivity image sensor 86 that takes the ordinary image and green narrowband image formed by second image forming system 85 at different timings. Color filters of the three primary colors, red (R), green (G), and blue (B), are arranged on the imaging surface of second high sensitivity image sensor 86 in a Bayer or honeycomb pattern.
The spectral sensitivity of imaging unit 80 is identical to that of the first embodiment illustrated in
Imaging unit 80 further includes imaging control unit 80a. Imaging control unit 80a is a unit that performs CDS/AGC (correlated double sampling/automatic gain control) and A/D conversion on image signals outputted from first and second high sensitivity image sensors 84, 86 and outputs the resultant image signals to image processing unit 3 through cable 5 (
The configuration of the image processing unit is identical to that of the rigid endoscope system of the first embodiment.
An operation of the rigid endoscope system of the second embodiment will now be described.
As described above, the rigid endoscope system of the present embodiment obtains the green narrowband image instead of the fluorescein fluorescence image obtained in the rigid endoscope system of the first embodiment and a deep portion blood vessel image is obtained by subtracting the green narrowband image from the ICG fluorescence image.
Hereinafter, a specific operation of the rigid endoscope system of the present embodiment will be described.
First, an operation of the system for capturing an ICG fluorescence image and an ordinary image will be described. When capturing an ICG fluorescence image and an ordinary image, blue light emitted from blue LD light source 40 of light source unit 6 is inputted to optical cables LC1, LC2 through condenser lens 41 and optical fiber splitter 43. Then, the blue light is guided through optical cables LC1 and LC2 and inputted to body cavity insertion section 30, and further guided through multimode optical fibers 51 of white light projection unit 50 in body cavity insertion section 30. Thereafter, a portion of the blue light outputted from the output end of each multimode optical fiber 51 is transmitted through fluorescent body 52 and directed to the observation area, while the remaining blue light other than the portion is subjected to wavelength conversion to green to yellow visible light by fluorescent body 52 and directed to the observation area. That is, the observation area is irradiated with white light formed of the blue light and green to yellow visible light.
In the meantime, near infrared light emitted from near infrared LD light source 46 of light source unit 6 is inputted to body cavity insertion section 30 through condenser lens 47 and optical cable LC4. Then, the near infrared light is guided through the multimode optical fiber of the near infrared light projection unit in body cavity insertion section 30 and directed to the observation area simultaneously with the white light.
Then, an ordinary image based on reflection light reflected from the observation area irradiated with the white light and an ICG fluorescence image based on ICG fluorescence emitted from the observation area irradiated with the near infrared light are captured simultaneously.
More specifically, an ordinary image is captured in the following manner. Reflection light reflected from the observation area irradiated with the white light is inputted to insertion member 30b from imaging lens 30d at the tip 30Y of insertion member 30b, then guided by the group of lenses inside of the insertion member 30b, and outputted to imaging unit 80.
The ordinary image inputted to imaging unit 80 is reflected in a right angle direction by dichroic prism 81, formed on the imaging surface of second high sensitivity image sensor 86 by second image forming system 85, and imaged by second high sensitivity image sensor 86.
The R, G, B image signals outputted from second high sensitivity image sensor 86 are subjected to CDS/AGC (correlated double sampling/automatic gain control) and A/D conversion in imaging control unit 80a, and outputted to image processing unit 3 through cable 5.
In the meantime, the ICG fluorescence image is captured in the following manner. The ICG fluorescence image emitted from the observation area irradiated with the near infrared excitation light is inputted to insertion member 30b from imaging lens 30d at the tip 30Y of insertion member 30b, then guided by the group of lenses inside of the insertion member 30b, and outputted to imaging unit 80.
The ICG fluorescence image inputted to imaging unit 80 is transmitted through dichroic prism 81 and excitation light cut filter 82, and formed on the imaging plane of first high sensitivity image sensor 84 by first image forming system 83 and imaged by first high sensitivity image sensor 84. The ICG fluorescence image signal outputted from first high sensitivity image sensor 84 is subjected to CDS/AGC (correlated double sampling/automatic gain control) and A/D conversion in imaging control unit 80a, and outputted to image processing unit 3 through cable 5.
Next, an operation of the system for capturing a green narrowband image will be described. When capturing a green narrowband image, green narrowband light emitted from green wavelength conversion laser light source 70 of light source unit 6 is inputted to optical cable LC3 through condenser lens 71. Then, the green narrowband light is guided through optical cable LC3 and inputted to body cavity insertion section 30, and further guided through the multimode optical fiber of the green light projection unit in body cavity insertion section 30. Then, the green narrowband light is outputted from the output end of the multimode optical fiber and directed to the observation area.
A green narrowband image reflected from the observation area irradiated with the green narrowband light is inputted to insertion member 30b from imaging lens 30d at the tip 30Y of insertion member 30b, then guided by the group of lenses inside of the insertion member 30b, and outputted to imaging unit 80.
The green narrowband image inputted to imaging unit 80 is reflected in a right angle direction by dichroic prism 81, then formed on the imaging surface of second high sensitivity image sensor 86 by second image forming system 85, and imaged by second high sensitivity image sensor 86 through the green (G) filters on the imaging surface thereof.
The green narrowband image signal outputted from second high sensitivity image sensor 86 is subjected to CDS/AGC (correlated double sampling/automatic gain control) and A/D conversion in imaging control unit 80a, and outputted to image processing unit 3 through cable 5.
Note that the imaging timing of the ordinary image, ICG fluorescence image, and green narrowband image is identical to that of A to C, E, and D of
Also note that blue LD light source 40 and near infrared LD light source 46 in light source unit 6 are drive controlled according to the timing charts of A to E of
Then, an ordinary image, an ICG fluorescence image, and a composite image are displayed based on the ordinary image signal formed of the R, G, and B signals, the ICG fluorescence image signal, and the green narrowband image signal obtained by imaging unit 80 in the manner as described above. The operation of the system for displaying these images is identical to that of the rigid endoscope system of the first embodiment shown in the flowcharts of
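One plausible way to form such a composite image is to pseudo-color the deep portion blood vessel image and alpha-blend it onto the ordinary image. This is purely an illustrative sketch: the actual compositing is defined by the first-embodiment flowcharts not reproduced here, and every name, the pseudo-color, and the blend weight below are assumptions.

```python
import numpy as np

def composite_display(ordinary_rgb, deep_vessel, color=(0.0, 1.0, 0.0), alpha=0.5):
    """Blend a deep portion vessel image onto an ordinary RGB image.

    ordinary_rgb : H x W x 3 float array with values in [0, 1].
    deep_vessel  : H x W float array; larger values mean stronger vessel signal.
    color, alpha : illustrative pseudo-color and blend weight (assumptions).
    """
    # Normalize the vessel signal to [0, 1]; guard against an all-zero image.
    v = deep_vessel / max(float(deep_vessel.max()), 1e-9)
    overlay = v[..., None] * np.asarray(color)  # pseudo-colored vessel map
    # Blend: strong vessel signal replaces the ordinary pixel with the overlay color.
    return np.clip((1.0 - alpha * v[..., None]) * ordinary_rgb + alpha * overlay, 0.0, 1.0)
```

Pixels with no vessel signal pass the ordinary image through unchanged, so the surgeon retains the normal white-light view with the deep vessels highlighted on top.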
Next, a rigid endoscope system that employs a third embodiment of the image obtaining method and image capturing apparatus of the present invention will be described in detail. The rigid endoscope system of the third embodiment obtains a luciferase fluorescence image using ultraviolet light instead of the fluorescein fluorescence image obtained in the rigid endoscope system of the first embodiment.
The overall configuration of the rigid endoscope system of the third embodiment is identical to that of the rigid endoscope system of the first embodiment shown in
Referring to
The light source unit of the rigid endoscope system of the present embodiment is identical to light source unit 6 of the second embodiment except that an ultraviolet light source is provided instead of green wavelength conversion laser light source 70.
Ultraviolet light emitted from the ultraviolet laser light source of the present embodiment is inputted to optical cable LC3, guided through optical cable LC3, and inputted to the multimode optical fiber of the ultraviolet light projection unit.
The spectrum of light outputted from each light projection unit provided inside of body cavity insertion section 30 of the present embodiment, and the spectra of fluorescence and reflection light emitted or reflected from an observation area irradiated with the light outputted from each light projection unit, are shown in
As shown in
Now referring to
Imaging unit 80 of the present embodiment is identical to imaging unit 80 of the second embodiment except that it further includes ultraviolet light cut filter 87 for cutting ultraviolet light. Ultraviolet light cut filter 87 is formed of a high-pass filter that cuts ultraviolet wavelengths of 375 nm and shorter, and is provided at the light incident surface of dichroic prism 81. Other configurations are identical to those of imaging unit 80 of the second embodiment described above.
Further, the configuration of image processing unit 3 is identical to that of the rigid endoscope system of the first or second embodiment.
An operation of the rigid endoscope system of the third embodiment will now be described.
As described above, the rigid endoscope system of the present embodiment obtains a luciferase fluorescence image instead of the fluorescein fluorescence image obtained in the rigid endoscope system of the first embodiment, and a deep portion blood vessel image is obtained by subtracting the luciferase fluorescence image from the ICG fluorescence image.
The operation of the system of the present embodiment for capturing the ICG fluorescence image and the ordinary image is identical to that of the system of the second embodiment. Therefore, it will not be elaborated upon further here, and only the operation for capturing a luciferase fluorescence image will be described. Although ultraviolet light cut filter 87 is added to imaging unit 80 of the present embodiment as described above, ultraviolet light cut filter 87 is formed of a high-pass filter that passes the ICG fluorescence image and the ordinary image, and therefore has no influence on the operation for capturing these images.
When capturing a luciferase fluorescence image, ultraviolet light emitted from the ultraviolet laser light source of light source unit 6 is inputted to optical cable LC3 through condenser lens 71. Then, the ultraviolet light is guided through optical cable LC3 and inputted to body cavity insertion section 30, and further guided through the multimode optical fiber of the ultraviolet light projection unit in body cavity insertion section 30. Then, the ultraviolet light is outputted from the output end of the multimode optical fiber and directed to the observation area.
A luciferase fluorescence image emitted from the observation area irradiated with the ultraviolet light is inputted to insertion member 30b from imaging lens 30d at the tip 30Y of insertion member 30b, then guided by the group of lenses inside of the insertion member 30b, and outputted to imaging unit 80.
The luciferase fluorescence image is reflected in a right angle direction by dichroic prism 81 after passing through ultraviolet light cut filter 87, then formed on the imaging surface of second high sensitivity image sensor 86 by second image forming system 85, and imaged by second high sensitivity image sensor 86 through the blue (B) filters on the imaging surface thereof. Here, ultraviolet light reflected from the observation area is cut by ultraviolet light cut filter 87 and does not enter second high sensitivity image sensor 86.
The luciferase fluorescence image signal outputted from second high sensitivity image sensor 86 is subjected to CDS/AGC (correlated double sampling/automatic gain control) and A/D conversion in imaging control unit 80a, and outputted to image processing unit 3 through cable 5.
Note that the imaging timing of the ordinary image, ICG fluorescence image, and luciferase fluorescence image is identical to that of A to C, E, and D of
Also note that blue LD light source 40, near infrared LD light source 46, and ultraviolet laser light source in light source unit 6 are drive controlled according to the timing charts of A to E of
Then, an ordinary image, an ICG fluorescence image, and a composite image are displayed based on the ordinary image signal formed of the R, G, and B signals, ICG fluorescence image signal, and luciferase fluorescence image signal obtained by imaging unit 80 in the manner as described above. The operation of the system for displaying these images is identical to that of the rigid endoscope system of the first embodiment shown in the flowcharts of
In the first to third embodiments described above, a blood vessel image is extracted, but images representing other tubular structures, such as lymphatic vessels, bile ducts, and the like, may also be extracted.
Further, in the first to third embodiments described above, the fluorescence image capturing apparatus of the present invention is applied to a rigid endoscope system, but the apparatus of the present invention may also be applied to other endoscope systems employing a flexible endoscope. Still further, the fluorescence image capturing apparatus of the present invention is not limited to endoscope applications and may be applied to so-called video camera type medical image capturing systems without an insertion section to be inserted into a body cavity.
Number | Date | Country | Kind |
---|---|---|---|
033534/2010 | Feb 2010 | JP | national |