The present invention relates to an electronic endoscope device capable of emitting light of different wavelengths onto a living tissue and capturing spectral images.
Recently, as described in, for example, Japanese Patent Provisional Publication No. JP2007-135989A, an electronic endoscope equipped with a function to capture spectral images has been proposed. With such an electronic endoscope, it may be possible to obtain image information containing a spectral property (a frequency characteristic of a light absorption property) of a living tissue such as a mucous membrane in a digestive organ, for example, a stomach or a rectum. It is known that the spectral property of a living tissue reflects information concerning the types and densities of the components contained in the vicinity of a surface layer of the living tissue. In particular, the spectral property of the living tissue can be obtained by superimposing the spectral properties of a plurality of essential components which constitute the living tissue.
A diseased portion in a living tissue may contain a greater amount of a substance that is rarely contained in a healthy portion of the living tissue. Therefore, a spectral property of a living tissue containing a diseased portion tends to differ from that of a living tissue containing only healthy portions. Thus, because the spectral properties of the healthy portion and the diseased portion differ from each other, it may be possible to determine whether or not a living tissue contains any diseased portion by comparing their spectral properties.
Meanwhile, the wavelength characteristics of scattering coefficients of human skin and mucous membranes have been researched, and it has been reported that the wavelength characteristic of scattering in living tissue within a wavelength range from 400 to 2,000 nm substantially coincides with the superimposed wavelength characteristics of Rayleigh scattering and Mie scattering (A. N. Bashkatov et al., "Optical properties of human skin, subcutaneous and mucous tissues in the wavelength range from 400 to 2000 nm," Journal of Physics D: Applied Physics, 2005, vol. 38, pp. 2543-2555, hereinafter referred to as "non-patent document 1").
While an endoscopic image of a living tissue is formed mainly with observation light reflected on a surface of the living tissue, the observation light may include not only the light reflected on the surface but also scattered light arising within the living tissue. However, because it has been difficult to accurately determine the degree of influence of the scattered light on the captured image, the influence of the scattered light has conventionally been ignored in the analysis of spectral images. The inventor of the present invention has found a method to quantitatively evaluate the influence of the scattered light by using spectral image data. By evaluating the observation light (i.e., an observation image) according to this method, the inventor discovered that the degree of influence of the scattered light on the observation light is greater than previously believed, and that this influence has caused noise in the evaluation of the spectral property of the living tissue.
The present invention is made in view of the circumstances described above. Namely, an object of the present invention is to provide an electronic endoscope device capable of eliminating the influence of the scattered light and the like and of displaying a high-contrast image in which a diseased portion and a healthy portion are easily recognizable.
To achieve the above described object, the electronic endoscope device according to the present invention is provided with a spectral image capturing means for capturing a spectral image in a body cavity within a predetermined wavelength range and obtaining spectral image data; a spectrum resolving means for resolving the spectrum data of each of the pixels contained in the spectral image data into a plurality of predetermined component spectra by performing a regression analysis; a spectrum compositing means for generating composite image data by removing at least one of the plurality of component spectra and recomposing the remaining component spectra; and a display means for displaying a screen based on the composite image data.
According to the configuration, the composite image data is generated after the component spectra acting as noise components are removed; therefore, an image, which provides higher contrast and in which the healthy portion and the diseased portion are easily identified, can be displayed.
Optionally, the plurality of component spectra includes, for example, an absorption spectrum of oxyhemoglobin, an absorption spectrum of deoxyhemoglobin, and a spectrum of a scattering coefficient. The spectrum resolving means may be configured to perform the regression analysis with the spectral data acting as an objective variable and with the absorption spectrum of oxyhemoglobin, the absorption spectrum of deoxyhemoglobin, and the spectrum of the scattering coefficient acting as explanatory variables. Optionally, the spectrum compositing means may be configured to recompose the absorption spectrum of oxyhemoglobin and the absorption spectrum of deoxyhemoglobin. In this case, it is preferable that the spectrum of the scattering coefficient includes a spectrum of a scattering coefficient in Rayleigh scattering and a spectrum of a scattering coefficient in Mie scattering. According to these configurations, by eliminating the influence of the scattered light, more accurate regression coefficients of oxyhemoglobin and deoxyhemoglobin can be obtained, and purpose-specific composite image data, such as data depending on the concentrations of oxyhemoglobin and deoxyhemoglobin, can be generated.
Optionally, the plurality of component spectra may include a spectrum indicating an offset which is specific to the electronic endoscope device. According to the configuration, the device-specific offset is removed; therefore, it is not necessary to calibrate the electronic endoscope device.
Optionally, the spectrum compositing means may be configured to obtain an average value of the recomposed component spectra and generate the composite image data with the average value acting as a pixel value. According to the configuration, the composite image data depending on the concentration of oxyhemoglobin and deoxyhemoglobin can be easily generated.
Optionally, it is preferable that the predetermined wavelength range is from 400 to 800 nm, and that the spectral image includes a plurality of images captured at wavelengths spaced at a predetermined interval defined within a range from 1 to 10 nm.
Optionally, it is preferable that the regression analysis is a multiple regression analysis.
As described above, according to the electronic endoscope device of the present invention, by eliminating the influence of the scattered light and the like, it is possible to display an image which provides higher contrast and in which the diseased portion and the healthy portion are easily identified.
Hereinafter, an embodiment according to the present invention is described with reference to the accompanying drawings.
The electronic endoscope 100 includes an insertion tube 110, which is to be inserted into a body cavity. At a tip-end portion (an insertion tube tip-end portion) 111 of the insertion tube 110, an objective optical system 121 is disposed. An image of a living tissue T around the insertion tube tip-end portion 111 is formed, through the objective optical system 121, on a light-receiving surface of an image capturing device 141 installed in the insertion tube tip-end portion 111.
The image capturing device 141 periodically (e.g., at an interval of 1/30 seconds) outputs an image signal corresponding to the image formed on the light-receiving surface. The image signal output by the image capturing device 141 is transmitted to the image processing unit 500 in the processor 200 for the electronic endoscope via a cable 142.
The image processing unit 500 includes an AD conversion circuit 510, a temporary memory 520, a controller 530, a video memory 540, and a signal processing circuit 550. The AD conversion circuit 510 performs analog-to-digital conversion on the image signal input from the image capturing device 141 of the electronic endoscope 100 via the cable 142 and outputs digital image data. The digital image data output from the AD conversion circuit 510 is transmitted to the temporary memory 520 and stored therein. The controller 530 processes one piece or a plurality of pieces of image data stored in the temporary memory 520 to generate one piece of displayable image data, and transmits the displayable image data to the video memory 540. For example, the controller 530 may produce displayable image data such as data generated from a single piece of image data, data to display a plurality of aligned images, or data to display an image obtained through image computation on a plurality of pieces of image data or a graph obtained from a result of the image computation, and store the produced displayable image data in the video memory 540. The signal processing circuit 550 converts the displayable image data stored in the video memory 540 into a video signal having a predetermined format (e.g., the NTSC format) and outputs the video signal. The video signal output from the signal processing circuit 550 is input to the image display device 300. As a result, an endoscopic image captured by the electronic endoscope 100 is displayed on the image display device 300.
In the electronic endoscope 100, a light guide 131 is installed. A tip-end portion 131a of the light guide 131 is disposed in the vicinity of the insertion tube tip-end portion 111. Meanwhile, a proximal-end portion 131b of the light guide 131 is connected to the processor 200 for the electronic endoscope. The processor 200 for the electronic endoscope includes therein the light source unit 400 (described later) having a light source 430 (e.g., a xenon lamp), which generates intense white light. The light generated by the light source unit 400 enters the light guide 131 through the proximal-end portion 131b. The light entering the light guide 131 through the proximal-end portion 131b is guided to the tip-end portion 131a by the light guide 131 and is emitted from the tip-end portion 131a. In the vicinity of the tip-end portion 131a of the light guide 131 in the insertion tube tip-end portion 111 of the electronic endoscope 100, a lens 132 is disposed. The light emitted from the tip-end portion 131a of the light guide 131 passes through the lens 132 and illuminates the living tissue T near the insertion tube tip-end portion 111.
As described above, the processor 200 for the electronic endoscope has both the function as a video processor, which processes the image signal output from the image capturing device 141 of the electronic endoscope 100, and the function as a light source device, which supplies illumination light for illuminating the living tissue T near the insertion tube tip-end portion 111 of the electronic endoscope 100 to the light guide 131 of the electronic endoscope 100.
In the present embodiment, the light source unit 400 in the processor 200 for the electronic endoscope includes the light source 430, a collimator lens 440, a spectral filter 410, a filter control unit 420, and a condenser lens 450. The white light emitted from the light source 430 is converted by the collimator lens 440 into a collimated beam, passes through the spectral filter 410, and then enters the light guide 131 from the proximal-end portion 131b through the condenser lens 450. The spectral filter 410 is a disk-shaped filter which breaks down the white light emitted from the light source 430 into light of a predetermined wavelength (i.e., selects a wavelength), and selectively filters and outputs light of a narrow band (a bandwidth of approximately 5 nm) centered at 400 nm, 405 nm, 410 nm, . . . 800 nm depending on a rotation angle thereof. The rotation angle of the spectral filter 410 is controlled by the filter control unit 420 connected to the controller 530. As the controller 530 controls the rotation angle of the spectral filter 410 via the filter control unit 420, the light of a predetermined wavelength enters the light guide 131 from the proximal-end portion 131b, and the living tissue T near the insertion tube tip-end portion 111 is illuminated. Then, the light reflected on the living tissue T and the light scattered in the living tissue T are converged on the light-receiving surface of the image capturing device 141 to form the image as described above, and the image signal corresponding to the formed image is transmitted to the image processing unit 500 via the cable 142.
The image processing unit 500 is a device configured to obtain a plurality of spectral images at wavelengths at an interval of 5 nm from the image of the living tissue T input via the cable 142. Specifically, the image processing unit 500 obtains the spectral images of the wavelengths when the spectral filter 410 selects and outputs the light in the narrow band (a bandwidth of approximately 5 nm) having the center wavelengths of 400 nm, 405 nm, 410 nm, . . . 800 nm respectively.
The image processing unit 500 has the function to process a plurality of spectral images generated through the spectral filter 410 and generate a colored image (a composite spectral image), as described later. Moreover, the image processing unit 500 controls the image display device 300 to display the processed composite spectral image.
As the spectral filter 410, for example, a Fabry-Perot filter, or a filter employing a known spectral image capturing method in which separated light is obtained with use of a transmission-type diffraction grating, may be employed.
As described above, the image processing unit 500 according to the present embodiment has the function to generate a composite spectral image which provides high contrast and in which a healthy portion and a diseased portion can be easily recognized, by using a plurality of spectral images of different wavelengths. The function to generate the composite spectral image will be described below.
As shown in
A measurement model of the spectral image data is expressed in the following Expression 1.
X(λ)=A(λ)+SMie(λ)+SRayleigh(λ)+F (EXPRESSION 1)
X represents data for a single pixel in the spectral image of the gastric mucosa (logarithmic representation), λ represents a wavelength of light, A represents an absorption coefficient of a medium (the living tissue T), SRayleigh represents a scattering coefficient of the medium in Rayleigh scattering, SMie represents a scattering coefficient of the medium in Mie scattering, and F represents the device-specific offset. In this regard, the device-specific offset F is a parameter indicating a reference signal intensity for the image capturing device 141. As described in Expression 1, the spectral image data X in the present embodiment is represented as a sum of the absorption coefficient A, the scattering coefficient SRayleigh in Rayleigh scattering, the scattering coefficient SMie in Mie scattering, and the device-specific offset F. The absorption coefficient A is expressed as Expression 2 described below based on the Beer-Lambert law.
A(λ)=log10(I0/I)=εCd (EXPRESSION 2)
A represents the absorption coefficient of the medium (the living tissue T), I0 represents an emission intensity of light before entering the medium, I represents an intensity of light having travelled in the medium for a distance of d, ε represents a molar light absorption coefficient, and C represents a molar concentration. If the medium has n types of light-absorbing substances, then the absorption coefficient A is expressed in Expression 3 described below.
A(λ)=(ε1C1+ε2C2+ . . . +εnCn)d (EXPRESSION 3)
That is, when the medium has n types of light-absorbing substances, the absorption coefficient A is expressed as a sum of the absorption properties of the light-absorbing substances. Therefore, the multiple regression analysis was performed, as described below in Expression 4, with the spectral image data of the gastric mucosa shown in
X represents the data for a single pixel in the spectral image and expresses logarithmically the brightness value of the spectral image, which can be obtained by irradiating the light having the center wavelengths ranging from 400 nm to 800 nm at every 5 nm. Meanwhile, a represents the light absorption property of oxyhemoglobin at every 5 nm within the wavelengths from 400 nm to 800 nm (
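The fitting described by Expression 4 can be treated as an ordinary least-squares problem with five explanatory variables. The following Python sketch illustrates one way to perform this decomposition for a single pixel, assuming the reference absorption spectra of oxyhemoglobin and deoxyhemoglobin are available as arrays sampled at the same 5 nm steps; the function and variable names are illustrative and not taken from the patent:

```python
import numpy as np

def resolve_spectrum(x, a_oxy, a_deoxy):
    """Resolve one pixel's log-spectrum x(lambda), sampled at 5 nm steps
    from 400 to 800 nm, into five component spectra by least squares.

    Returns the coefficients in column order: oxyhemoglobin,
    deoxyhemoglobin, Mie scattering, Rayleigh scattering, offset.
    """
    lam = np.arange(400.0, 801.0, 5.0)        # wavelengths in nm (81 samples)
    s_mie = 73.7 * lam ** -0.22               # Expression 5
    s_rayleigh = 1.1e12 * lam ** -4.0         # Expression 6
    ones = np.ones_like(lam)                  # constant column for the offset
    # Design matrix: one column per explanatory variable.
    M = np.column_stack([a_oxy, a_deoxy, s_mie, s_rayleigh, ones])
    coeffs, *_ = np.linalg.lstsq(M, x, rcond=None)
    return coeffs
```

In an actual device this fit would be repeated for every pixel of the spectral image, yielding the coefficient maps from which a composite image can later be built.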
SMie(λ)=73.7λ^−0.22 (EXPRESSION 5)
SRayleigh(λ)=1.1×10^12·λ^−4 (EXPRESSION 6)
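Expressions 5 and 6 can be evaluated directly as functions of the wavelength in nanometers. A short Python transcription of the two scattering-coefficient spectra (the function names are illustrative):

```python
def mie_scattering(lam_nm):
    """Scattering coefficient in Mie scattering (Expression 5)."""
    return 73.7 * lam_nm ** -0.22

def rayleigh_scattering(lam_nm):
    """Scattering coefficient in Rayleigh scattering (Expression 6).

    The lambda^-4 dependence makes this term fall off far more steeply
    across the 400-800 nm observation range than the Mie term.
    """
    return 1.1e12 * lam_nm ** -4.0
```

Halving the wavelength from 800 nm to 400 nm increases the Rayleigh term sixteenfold, while the Mie term changes only slightly; this difference in shape is what allows the regression to separate the two scattering contributions.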
The last term in Expression 4 is a constant term corresponding to the device-specific offset F, and, in the present embodiment, the multiple regression coefficient P5 is equal to the device-specific offset F.
Usually, the signal intensity of the image capturing device 141 is not calibrated; therefore, the absolute value of the intensity of the video signals generated by the image capturing device 141 commonly contains a non-negligible amount of error. Moreover, a reference level of the video signals may fluctuate depending on observation conditions (for example, ambient brightness in an observation area). If the multiple regression analysis is performed with these errors left in the video signals, an accurate analysis result (i.e., a result with smaller residual errors) cannot be obtained. Accordingly, in the present embodiment, by performing the multiple regression analysis with the device-specific offset included among the explanatory variables, as expressed in Expression 4, the reference level is automatically and properly corrected without calibrating the signal intensity of the image capturing device 141. Thus, a multiple regression analysis with higher accuracy can be achieved.
With the multiple regression analysis (i.e., fitting) based on Expression 4, the data for a single pixel in the spectral image is resolved into the spectrum of the light absorption property of oxyhemoglobin, the spectrum of the light absorption property of deoxyhemoglobin, the spectrum of the scattering coefficient in Rayleigh scattering, the spectrum of the scattering coefficient in Mie scattering, and the spectrum of the device-specific offset, and the contribution rates of these spectra (component spectra) are obtained as the multiple regression coefficients P1-P5, respectively. In other words, the multiple regression coefficients P1-P5 are the coefficients which indicate the component ratio of the elements (i.e., oxyhemoglobin, deoxyhemoglobin, Rayleigh scattering, Mie scattering, and the device-specific offset) constituting the data for the single pixel in the spectral image. Therefore, it is possible to obtain the component ratio of oxyhemoglobin and deoxyhemoglobin in the living tissue T from the multiple regression coefficient P1 of oxyhemoglobin and the multiple regression coefficient P2 of deoxyhemoglobin obtained from the multiple regression analysis. Accordingly, it is possible to substantially distinguish the healthy portion and the diseased portion. However, in order to determine the healthy portion and the diseased portion based on the multiple regression coefficients as a type of index, it is necessary to associate the endoscopic image currently under observation with the healthy portion and the diseased portion, and to indicate to the operator which portion in the endoscopic image is the healthy portion or the diseased portion.
Therefore, in the present embodiment, by recomposing an image based on the multiple regression coefficient P1 of oxyhemoglobin and the multiple regression coefficient P2 of deoxyhemoglobin, these coefficients are fed back into the endoscopic image shown to the operator. More specifically, by using the estimated values of P1 and P2 obtained from the multiple regression analysis, a composite absorption spectrum X* is generated based on Expression 7. That is, in Expression 7, only the spectrum of the light absorption property of oxyhemoglobin and the spectrum of the light absorption property of deoxyhemoglobin are recomposed.
As can be seen in a comparison between Expression 7 and Expression 4, in the composite absorption spectrum X*, Rayleigh scattering, Mie scattering, and the device-specific offset are regarded as noise components and are eliminated. Therefore, based on the composite absorption spectrum X*, an image (a composite spectral image) from which the influence of scattering and the device-specific offset is removed can be generated.
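The recomposition of Expression 7 is a weighted sum of the two hemoglobin reference spectra. A minimal Python sketch, assuming the fitted coefficients for oxyhemoglobin and deoxyhemoglobin and the corresponding reference spectra are at hand (the names are illustrative, not from the patent):

```python
def recompose_absorption(p_oxy, p_deoxy, a_oxy, a_deoxy):
    """Composite absorption spectrum X* (Expression 7): only the oxy-
    and deoxyhemoglobin components are kept; the Mie, Rayleigh, and
    offset terms are dropped as noise components.
    """
    return [p_oxy * ao + p_deoxy * ad for ao, ad in zip(a_oxy, a_deoxy)]
```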
When the colored image in
Hereafter, an image generation process executed by the image processing unit 500 according to the present embodiment is explained.
When the routine is started, step S1 is processed. In step S1, the image processing unit 500 transmits a control signal to the filter control unit 420 to obtain spectral images. When the control signal is received, the filter control unit 420 controls the rotation angle of the spectral filter 410 to sequentially select the light of narrow bands (a bandwidth of approximately 5 nm) centered at 400, 405, 410, . . . 800 nm. The image processing unit 500 captures the spectral image obtained at each wavelength and stores it in the temporary memory 520. Then, the process proceeds to step S2.
In step S2, an average value of the spectrum data is calculated for each pixel in the spectral images obtained in step S1, and a piece of color image data is generated with the average value acting as the pixel value. The color image data corresponds to the average brightness value of the spectrum from 400 nm to 800 nm; therefore, a colored image equivalent to the endoscopic image under normal observation (i.e., an endoscopic image obtained by white light) is generated. Then, the image processing unit 500 transmits the generated color image data to the video memory 540 and controls the image display device 300 to display the image on a left side of the screen. As a result, the image as shown in
In step S3, it is determined whether the operation unit (not shown) of the processor 200 for the electronic endoscope has been operated and thereby a trigger input instructing generation of the composite spectral image has occurred while step S1 or S2 was being processed. When no trigger input has occurred (S3: NO), the flow returns to step S1 to obtain the spectral images again. That is, unless the trigger input occurs, the colored image obtained from the spectral images continues to be displayed on the image display device 300 while being sequentially updated. On the other hand, when the trigger input has occurred during execution of steps S1 to S2 (S3: YES), the flow proceeds to step S4.
In step S4, the multiple regression analysis is performed on the spectral images obtained in step S1. More specifically, the multiple regression coefficients P1-P5 are calculated by using Expression 4 for every pixel in the spectral images obtained in step S1. Then, the flow proceeds to step S5.
In step S5, the composite absorption spectrum X* is generated by using Expression 7 for each of the multiple regression coefficients P1 and P2 calculated in step S4 for each pixel. Next, the flow proceeds to step S6.
In step S6, the composite spectral image is generated based on the composite absorption spectra X* calculated in step S5. More specifically, an average value of the composite absorption spectra X* is calculated for each pixel, and based on the average values acting as the pixel values, the composite spectral image data (composite image data) is generated. Then, the generated composite spectral image data is transmitted to the video memory 540 and is displayed on a right side of the screen on the image display device 300. As a result, the image as shown in
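The reduction performed in step S6 can be sketched as follows: each pixel's composite absorption spectrum X* is collapsed to its average, which then serves as that pixel's brightness in the composite spectral image (a minimal sketch with illustrative names):

```python
def composite_pixel_value(x_star):
    """Pixel value for the composite spectral image: the average of the
    composite absorption spectrum X* over all sampled wavelengths."""
    return sum(x_star) / len(x_star)

def composite_image(spectra):
    """Map a 2-D grid of per-pixel composite spectra to a 2-D image of
    average brightness values."""
    return [[composite_pixel_value(s) for s in row] for row in spectra]
```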
In step S7, the image processing unit 500 displays, on the image display device 300, a message inquiring whether to generate the composite spectral image again and accepts an input from the operation unit (not shown) of the processor 200 for the electronic endoscope. When the user of the electronic endoscope device 1 operates the operation unit and selects re-generation of the composite spectral image (S7: YES), the flow returns to step S1. On the other hand, if re-generation of the composite spectral image is not instructed for a predetermined time period (e.g., several seconds) (S7: NO), the flow proceeds to step S8.
In step S8, the image processing unit 500 displays, on the image display device 300, a message inquiring whether to terminate displaying of the composite spectral image and accepts an input from the operation unit (not shown) of the processor 200 for the electronic endoscope. When the user of the electronic endoscope device 1 operates the operation unit and selects termination of displaying of the composite spectral image (S8: YES), the routine is terminated. On the other hand, if termination of displaying of the composite spectral image is not instructed for a predetermined time period (e.g., several seconds) (S8: NO), the flow returns to step S7.
As described above, through execution of the routine shown in the flowchart of
In the present embodiment, the image processing unit 500 is configured to perform the multiple regression analysis by using the entire spectral image data obtained at every 5 nm in the wavelength range from 400 to 800 nm; however, the present invention is not limited to such a configuration. For example, the wavelength range may be a narrower range including the wavelength bandwidth from 500 nm to 590 nm, which is the absorption wavelength bandwidth of oxyhemoglobin and deoxyhemoglobin, and reference values required for standardizing each pixel. For another example, a configuration to perform the multiple regression analysis by using only the spectral image data for the wavelength bandwidth from 500 nm to 590 nm may be employed. For another example, as long as a spectrum for a pixel corresponding to the diseased portion and a spectrum for a pixel corresponding to the healthy portion are distinguishable, the spectral image data may not necessarily be obtained at the interval of 5 nm. The interval of the wavelengths at which the spectral image data is obtained may be, for example, selectable within a range from 1 to 10 nm.
Further, in the present embodiment, a configuration to achieve fitting by the multiple regression analysis is employed; however, another linear regression analysis, such as a multiple regression analysis with non-negative constraints or a least squares method, or an optimization method other than linear regression analyses, such as the Newton method, a quasi-Newton method, a conjugate gradient method, or a damped least squares method, may be applied as long as the fitting (optimization) is achieved based on a multivariate analysis.
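As one example of the alternatives mentioned above, a multiple regression analysis with non-negative constraints can be sketched with a simple projected-gradient iteration; this is an illustrative toy solver, not the method prescribed by the patent:

```python
import numpy as np

def nnls_fit(M, x, iters=2000):
    """Least-squares fit of x against the columns of M with the
    coefficients constrained to be non-negative, via projected
    gradient descent (a simple sketch, not a production solver)."""
    lr = 1.0 / np.linalg.norm(M, ord=2) ** 2   # step size from the spectral norm
    p = np.zeros(M.shape[1])
    for _ in range(iters):
        grad = M.T @ (M @ p - x)               # gradient of 0.5*||M p - x||^2
        p = np.maximum(p - lr * grad, 0.0)     # project onto p >= 0
    return p
```

Non-negative constraints are natural here because concentrations and scattering contributions cannot physically be negative.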
Priority Application: Japanese Patent Application No. 2012-114338, filed May 2012 (JP, national)
Filing Document: PCT/JP2013/061869, filed Apr. 23, 2013 (WO)