1. Field of the Invention
The present invention relates to a medical image processing apparatus and a medical image processing method, and, more particularly to a medical image processing apparatus and a medical image processing method for applying image segmentation processing to a medical image obtained by picking up an image of a living tissue.
2. Description of the Related Art
A technique has conventionally been used for detecting, on the basis of a calculated feature value of an image obtained by picking up an image of a living tissue, an area where a predetermined object is present in the image. A method of subjecting an image to image segmentation by a dynamic contour detecting method that can handle a phase change, such as the Level-Set method, is disclosed in Yasushi Yagi and Hideo Saito eds.: “Computer Vision Forefront Guide <1>”, pp. 1-28, Advanced Communication Media Co., Ltd. (2009); Tony F. Chan and Luminita A. Vese: “Active Contours Without Edges”, IEEE TRANSACTIONS ON IMAGE PROCESSING, VOL. 10, NO. 2, February 2001, pp. 266-277; and Christophe Samson, Laure Blanc-Feraud, Gilles Aubert, and Josiane Zerubia: “Multiphase Evolution and Variational Image Classification”, INRIA Sophia Antipolis (1999) (hereinafter referred to as the Non-Patent Documents).
Further, for example, Japanese Patent Application Laid-Open Publication No. 2007-313119 discloses a technique for calculating edge intensity of a pixel in an intra-body cavity image, calculating a correlation value between the edge intensity and a bubble model set in advance on the basis of a characteristic of a bubble image, subjecting the intra-body cavity image to image segmentation on the basis of an edge of the intra-body cavity image, and detecting a bubble area in the intra-body cavity image on the basis of the correlation value and a result of the image segmentation.
A medical image processing apparatus according to an aspect of the present invention is a medical image processing apparatus capable of performing image segmentation processing for a medical image including: a feature value calculating unit that calculates at least one feature value from the medical image; a category setting unit that sets, on the basis of the feature value calculated by the feature value calculating unit, plural categories in the medical image; a parameter setting unit that sets, on the basis of the feature value calculated by the feature value calculating unit, a parameter for each of areas of the medical image respectively classified into the categories set by the category setting unit; and an image segmenting unit that performs the image segmentation processing for the medical image according to an arithmetic operation of a contour detecting method employing a dynamic contour model specified according to the number of categories set by the category setting unit and parameters set by the parameter setting unit.
A medical image processing apparatus according to another aspect of the present invention is a medical image processing apparatus capable of performing image segmentation processing for a medical image including: a feature value calculating unit that calculates at least one feature value from the medical image; a structural component area setting unit that sets plural structural component areas corresponding to structural components included in the medical image; an area integrating unit that integrates areas having the feature value similar to each other in the respective plural structural component areas; a category setting unit that sets categories on the basis of a result of the integration by the area integrating unit; a parameter setting unit that sets, on the basis of the feature value in the respective structural component areas obtained as the result of the integration by the area integrating unit, a parameter for each of the structural component areas obtained as the integration result; and an image segmenting unit that performs the image segmentation processing for the medical image according to an arithmetic operation of a contour detecting method employing a dynamic contour model specified according to the number of categories set by the category setting unit and parameters set by the parameter setting unit.
A medical image processing method according to still another aspect of the present invention is a medical image processing method for performing image segmentation processing for a medical image including: a feature value calculating step for calculating at least one feature value from the medical image; a category setting step for setting, on the basis of the feature value calculated by the feature value calculating step, plural categories in the medical image; a parameter setting step for setting, on the basis of the feature value calculated by the feature value calculating step, a parameter for each of areas of the medical image respectively classified into the categories set by the category setting step; and an image segmenting step for performing the image segmentation processing for the medical image according to an arithmetic operation of a contour detecting method employing a dynamic contour model specified according to the number of categories set by the category setting step and parameters set by the parameter setting step.
A medical image processing method according to still another aspect of the present invention is a medical image processing method for performing image segmentation processing for a medical image including: a feature value calculating step for calculating at least one feature value from the medical image; a structural component area setting step for setting plural structural component areas corresponding to structural components included in the medical image; an area integrating step for integrating areas having the feature value similar to each other in the respective plural structural component areas; a category setting step for setting categories on the basis of a result of the integration by the area integrating step; a parameter setting step for setting, on the basis of the feature value in the respective structural component areas obtained as the result of the integration by the area integrating step, a parameter for each of the structural component areas obtained as the integration result; and an image segmenting step for performing the image segmentation processing for the medical image according to an arithmetic operation of a contour detecting method employing a dynamic contour model specified according to the number of categories set by the category setting step and parameters set by the parameter setting step.
Embodiments of the present invention are explained below with reference to the drawings.
An endoscope apparatus 1 includes, as shown in
The endoscope 2 includes an insertion portion 21a having a shape and a dimension for allowing insertion into the body cavity of the examinee, a distal end portion 21b provided on a distal end side of the insertion portion 21a, and an operation portion 21c provided on a proximal end side of the insertion portion 21a. A light guide 7 for transmitting the illumination light emitted in the light source device 3 to the distal end portion 21b is inserted through an inside of the insertion portion 21a.
One end face (a light incident end face) of the light guide 7 is detachably connected to the light source device 3. The other end face (a light exit end face) of the light guide 7 is arranged near a not-shown illumination optical system provided at the distal end portion 21b of the endoscope 2. With such a configuration, the illumination light emitted in the light source device 3 is emitted to the living tissue 101 after passing through the light guide 7 connected to the light source device 3 and the not-shown illumination optical system provided at the distal end portion 21b.
An object optical system 22 that forms an optical image of a subject and a CCD 23 that picks up the optical image formed by the object optical system 22 and acquires an image are provided at the distal end portion 21b of the endoscope 2. An observation mode changeover switch 24 that can perform an instruction for switching an observation mode to a normal light observation mode or a narrowband light observation mode is provided in the operation portion 21c of the endoscope 2.
The light source device 3 includes a white light source 31 including a Xenon lamp, a rotating filter 32 that changes white light emitted from the white light source 31 to frame-sequential illumination light, a motor 33 that drives to rotate the rotating filter 32, a motor 34 that moves the rotating filter 32 and the motor 33 in a direction perpendicular to an emission optical path of the white light source 31, a rotating filter driving unit 35 that drives the motors 33 and 34 on the basis of control by the processor 4, and a condensing optical system 36 that condenses illumination light passed through the rotating filter 32 and supplies the illumination light to the incident end face of the light guide 7.
As shown in
The first filter group 32A includes an R filter 32r that transmits light in a wavelength band of red, a G filter 32g that transmits light in a wavelength band of green, and a B filter 32b that transmits light in a wavelength band of blue. The filters 32r, 32g, and 32b are provided along the circumferential direction on the inner circumferential side of the rotating filter 32.
For example, as shown in
In other words, the white light emitted in the white light source 31 passes through the first filter group 32A, whereby wideband light for the normal light observation mode is generated.
The second filter group 32B includes a Bn filter 321b that transmits blue and narrowband light and a Gn filter 321g that transmits green and narrowband light. The filters 321b and 321g are provided along the circumferential direction on the outer circumferential side of the rotating filter 32.
For example, as shown in
For example, as shown in
In other words, the white light emitted in the white light source 31 is dispersed through the second filter group 32B, whereby narrowband light in plural bands for the narrowband light observation mode is generated.
The processor 4 includes a function of an image processing apparatus. Specifically, the processor 4 includes an image processing unit 41 and a control unit 42. The image processing unit 41 includes an image data generating unit 41a, an arithmetic unit 41b, and a video signal generating unit 41c.
The image data generating unit 41a of the image processing unit 41 applies processing such as noise removal and A/D conversion to an output signal from the endoscope 2 on the basis of control by the control unit 42 to thereby generate image data corresponding to an image obtained in the CCD 23.
The arithmetic unit 41b of the image processing unit 41 performs predetermined processing using the image data generated by the image data generating unit 41a to thereby perform image segmentation processing for image data obtained by picking up an image of the living tissue 101. In the present embodiment, it is assumed that a blood vessel is included in the image data and that image segmentation processing for distinguishing an area where the blood vessel is present from other areas in the image data is performed. Such image segmentation processing is explained later in detail.
The video signal generating unit 41c of the image processing unit 41 applies processing such as gamma conversion and D/A conversion to the image data generated by the image data generating unit 41a to thereby generate and output a video signal.
When it is detected that an instruction for switching the observation mode to the normal light observation mode is performed on the basis of an instruction of the observation mode changeover switch 24, the control unit 42 applies control for causing the light source device 3 to emit wideband light for the normal light observation mode to the rotating filter driving unit 35. The rotating filter driving unit 35 causes, on the basis of control by the control unit 42, the motor 34 to operate to interpose the first filter group 32A on the emission optical path of the white light source 31 and retract the second filter group 32B from the emission optical path of the white light source 31.
When it is detected that an instruction for switching the observation mode to the narrowband light observation mode is performed on the basis of an instruction of the observation mode changeover switch 24, the control unit 42 applies control for causing the light source device 3 to emit narrowband light in plural bands for the narrowband light observation mode to the rotating filter driving unit 35. The rotating filter driving unit 35 causes, on the basis of control by the control unit 42, the motor 34 to operate to interpose the second filter group 32B on the emission optical path of the white light source 31 and retract the first filter group 32A from the emission optical path of the white light source 31.
In other words, with the configuration of the endoscope apparatus 1 explained above, when the normal light observation mode is selected, it is possible to cause the display device 5 to display an image having a color tone substantially the same as a color tone of an object seen by the naked eye (a normal light image) and/or cause the external storage device 6 to store the image. With the configuration of the endoscope apparatus 1 explained above, when the narrowband light observation mode is selected, it is possible to cause the display device 5 to display an image in which a blood vessel included in the living tissue 101 is highlighted (a narrowband light image) and/or cause the external storage device 6 to store the image.
Actions of the present embodiment are explained below.
After turning on a power supply for the units of the endoscope apparatus 1, a surgeon selects the normal light observation mode in the observation mode changeover switch 24. The surgeon inserts the endoscope 2 into a body cavity while looking at an image displayed on the display device 5 when the normal light observation mode is selected, i.e., an image having a color tone substantially the same as a color tone of an object seen by naked eyes to thereby bring the distal end portion 21b close to an area where the living tissue 101 as an observation target is present.
When the normal light observation mode is selected in the observation mode changeover switch 24, lights of respective colors, i.e., R light, G light, and B light are sequentially emitted from the light source device 3 to the living tissue 101. Images respectively corresponding to the lights of the respective colors are acquired in the endoscope 2.
When the image corresponding to the R light, the image corresponding to the G light, and the image corresponding to the B light are inputted, the image data generating unit 41a of the image processing unit 41 generates image data respectively corresponding to the images (step S1 in
In the present embodiment, it is assumed that the image data generated by the image data generating unit 41a is formed by three planes for R, G, and B, has an image size of ISX pixels in the horizontal direction and ISY pixels in the vertical direction, and has 8-bit gradations of 0 to 255 for each pixel. In the following explanation, the pixel values of the jth (1≦j≦ISX×ISY) pixels in the planes for R, G, and B are respectively represented as rj, gj, and bj.
The arithmetic unit 41b of the image processing unit 41 calculates, on the basis of the image data generated by the image data generating unit 41a, for each of the pixels of the image data, a predetermined color-tone feature value used for the following processing (step S2 in
More specifically, the arithmetic unit 41b in the present embodiment calculates, for each of the pixels (i.e., all pixels from the first to the (ISX×ISY)th), the value of gj/rj, which is the ratio of the pixel value gj of the jth pixel of the G plane to the pixel value rj of the jth pixel of the R plane, as the predetermined color-tone feature value (a color-tone feature value calculated on the basis of image data generated when the normal light observation mode is selected).
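As an illustration only (not part of the embodiment), the per-pixel calculation of the color-tone feature value gj/rj described above can be sketched in Python as follows; the function name, array names, and image size are hypothetical.

```python
import numpy as np

def color_tone_feature(r_plane, g_plane):
    """Compute the color-tone feature value g/r for every pixel.

    r_plane, g_plane: 2-D uint8 arrays (ISY x ISX) holding the R and G planes.
    Pixels with r == 0 are left undefined (NaN), mirroring the assumption rj != 0.
    """
    r = r_plane.astype(np.float64)
    g = g_plane.astype(np.float64)
    feature = np.full_like(r, np.nan)
    valid = r > 0
    feature[valid] = g[valid] / r[valid]
    return feature

# Illustrative usage with a random 8-bit image of size ISX x ISY = 640 x 480.
rng = np.random.default_rng(0)
r_plane = rng.integers(1, 256, size=(480, 640), dtype=np.uint8)
g_plane = rng.integers(0, 256, size=(480, 640), dtype=np.uint8)
gr = color_tone_feature(r_plane, g_plane)
```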
In the present embodiment, the following explanation is made assuming that the pixel value rj≠0 always holds and assuming that an occurrence frequency of the predetermined color-tone feature value conforms to a normal distribution (a multivariate normal distribution).
Thereafter, the arithmetic unit 41b of the image processing unit 41 performs, using the predetermined color-tone feature value calculated in step S2 in
Details of the image segmentation processing in the present embodiment are explained. In the following explanation, for simplification of the explanation, it is assumed that processing is applied to schematic image data in which an area equivalent to an inside of a blood vessel (an inner side of the blood vessel) is represented as a dot pattern, an area equivalent to a background mucous membrane (an outer side of the blood vessel) is represented as white, and the boundary line between these two areas is represented as a thin solid line, for example, as shown in
The arithmetic unit 41b performs an arithmetic operation for estimating the number of categories and a parameter for each of the categories in an arithmetic operation of a contour detecting method employing a dynamic contour model similar to a Mumford-Shah model disclosed in the Non-Patent Document (Christophe Samson, Laure Blanc-Feraud, Gilles Aubert, and Josiane Zerubia: “Multiphase Evolution and Variational Image Classification”, INRIA Sophia Antipolis) (step S3 in
First, processing for extracting, from an image, areas including various clear structural components and a background area such as an interstitial portion, and for calculating the number of categories and a parameter for each of the categories on the basis of the feature values of these areas, is explained.
The arithmetic unit 41b applies a convolutional operation employing a publicly-known edge detection filter (e.g., a filter having a size of 5×5) to the G plane of the image data to thereby obtain an image Ga composed of edge components, and creates an edge map M indicating the pixels in which an edge component is present in the image Ga.
In the image Ga, which is obtained on the basis of the micro structure of a mucous membrane surface in a normal endoscopic image in which a stain or the like is not used, the pixel value of a blood vessel is relatively small with respect to the pixel value of the peripheral mucous membrane because hemoglobin absorbs a wavelength component included in the G light. In the image Ga, the pixel value of a pit is relatively large with respect to the pixel value of the peripheral mucous membrane because the wavelength component included in the G light is reflected there. It is possible to extract various categories by making use of such characteristics.
On the other hand, the arithmetic unit 41b creates a large edge map L, different from the edge map M, in order to extract clear edges of blood vessels. Specifically, the arithmetic unit 41b creates the large edge map L by binarizing and extracting each pixel in a range of |gaj|>Th1 (e.g., Th1=10). The threshold Th1 is not limited to a fixed value and may be, for example, a threshold adaptively determined so as to include the top 20% of pixel values in a histogram of the pixel values gaj.
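The construction of the edge map M and the large edge map L described above might be sketched as follows; the 5×5 kernel coefficients and the criterion used for the edge map M are assumptions (the embodiment does not specify them), while the fixed threshold Th1=10 and the top-20% adaptive alternative follow the text.

```python
import numpy as np
from scipy.ndimage import convolve

def edge_maps(g_plane, th1=10.0, adaptive=False):
    """Build an edge map M and a 'large edge' map L from the G plane.

    A Laplacian-like 5x5 kernel stands in for the publicly-known edge
    detection filter referred to in the text (actual coefficients unknown).
    """
    kernel = np.array([[ 0,  0, -1,  0,  0],
                       [ 0, -1, -2, -1,  0],
                       [-1, -2, 16, -2, -1],
                       [ 0, -1, -2, -1,  0],
                       [ 0,  0, -1,  0,  0]], dtype=np.float64)
    ga = convolve(g_plane.astype(np.float64), kernel, mode='nearest')

    # Edge map M: pixels with any non-zero edge response (illustrative criterion).
    edge_map_m = np.abs(ga) > 0

    if adaptive:
        # Adaptive threshold: keep the top 20% of |ga| values instead of a fixed Th1.
        th1 = np.percentile(np.abs(ga), 80)
    large_edge_map_l = np.abs(ga) > th1
    return ga, edge_map_m, large_edge_map_l
```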
A reason for creating the large edge map L is as explained below.
According to the schematic image data shown in
Subsequently, the arithmetic unit 41b performs, by alternately referring to the edge map M and the large edge map L, processing for selecting (one or plural) linked components mc that overlap a pixel determined to have a large edge in the large edge map L, among the linked components m1 to mC of the edge map M. According to such processing, it is possible to extract, among the linked components of the edge map M, only those (one or plural) linked components equivalent to an area having a large edge, in which it is highly likely that an image of a blood vessel is picked up.
Further, the arithmetic unit 41b extracts, on the basis of the result of this processing, the respective closed areas formed by the linked components mc determined to have a large edge, using, for example, a closed area detecting method disclosed in Japanese Patent Application Laid-Open Publication No. 11-003428. The representative areas Ad (1≦d≦D) extracted in this way are areas including various structural components set as targets of image segmentation in the image, in particular areas including relatively clear structural components among those various structural components.
The arithmetic unit 41b regards, as a global background area, a pixel group not extracted as an edge in the edge map M and extracts the pixel group.
Thereafter, the arithmetic unit 41b repeats processing for integrating, as one category, areas having values of the color-tone feature value gj/rj similar to each other in each of the extracted representative areas Ad, to thereby estimate the number of all categories K (1≦K) included in the schematic image data shown in
Subsequently, for each of the K categories obtained by the processing explained above, the arithmetic unit 41b calculates μGRi and σGRi (1≦i≦K), which are the average and the standard deviation of the values of the color-tone feature value gj/rj.
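The per-category parameter calculation (μGRi and σGRi) can be illustrated with a short sketch; the label map and function name are hypothetical stand-ins for the integration result described above.

```python
import numpy as np

def category_parameters(gr_feature, category_labels, num_categories):
    """Estimate mu_GR and sigma_GR for each category.

    gr_feature:      2-D array of g/r values (NaN where undefined).
    category_labels: 2-D integer array with labels 1..num_categories (0 = unlabeled).
    """
    mu = np.zeros(num_categories)
    sigma = np.zeros(num_categories)
    for i in range(1, num_categories + 1):
        values = gr_feature[(category_labels == i) & ~np.isnan(gr_feature)]
        if values.size:
            mu[i - 1] = values.mean()      # average of g/r in category i
            sigma[i - 1] = values.std()    # standard deviation of g/r in category i
    return mu, sigma
```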
Concerning the calculation of the number of categories and a parameter, the processing target area can also be set as an area designated by manual operation of the surgeon or the like. In such a case, the arithmetic unit 41b only has to set an area including a structural component such as a blood vessel and a background mucous membrane area on the basis of input operation performed on a not-shown input interface, give category numbers i (1≦i≦K, where K is the number of categories) to the set areas, and acquire a calculation result obtained by calculating the average and the standard deviation of the color-tone feature values gj/rj for each of the areas to which the category numbers are given.
In the estimation methods enumerated above, it is possible to improve estimation accuracy for a parameter by, before estimating a parameter, excluding a pixel having a value of a color-tone feature value that could occur redundantly in plural categories.
Concerning a pixel not suitable for calculation of a color-tone feature value such as a pixel not having sufficient brightness and a high-luminance pixel like halation, estimation of a parameter may be performed after the pixel is excluded by threshold processing.
Further, in the series of processing explained above, the processing is not limited to estimating a parameter on the basis of the image Ga obtained by applying the edge detection filter to the G plane, and may be, for example, processing for estimating a parameter directly using the values of the color-tone feature value gj/rj of each pixel. When a parameter is estimated directly using the values of gj/rj, it is possible to reduce influences due to the magnitude of a light amount, shading, and the like, which could cause exponential fluctuation of a pixel value. When a parameter is estimated directly using the values of gj/rj, the series of thresholds only has to be determined as appropriate, taking into account the fact that the values of gj/rj in a normal endoscopic image generally fit within a range of 0≦gj/rj≦1.
In mucous membrane micro structures such as blood vessels and pits, it often occurs that color tones differ from each other, even though the structures belong to the same histologic category, because of factors such as the tissue characteristics of the structures and the depth at which the structures are present under the mucous membrane. In such a case, for example, the mucous membrane micro structures only have to be separated into plural categories such as a blood vessel 1 and a blood vessel 2, and a parameter only has to be estimated for each of the plural categories. Specifically, this can be realized by appropriately setting an integration criterion used when integrating the representative areas Ad, or a separation criterion for areas after the integration of the representative areas Ad is performed, estimating the final number of categories, and then estimating a parameter for each of the categories.
The series of processing explained above is not limited to processing using the values of gj/rj as the color-tone feature values, and may be, for example, processing for appropriately selecting and using either the values of gj/rj or the values of bj/gj as the color-tone feature values, according to the category set as an extraction target.
In other words, according to the estimation methods explained above, it is possible to obtain an estimation result that the K categories in total are included in the schematic image data shown in
On the other hand, the arithmetic unit 41b performs an arithmetic operation related to image segmentation by applying the number of categories and the estimation results of parameters obtained in step S3 in
Φi (1≦i≦K) in Equation (1) is called distance function. A pixel at Φi=0 is a boundary of an area. Values of Φi in other pixels are set according to distances from the boundary. Specifically, concerning a pixel present on an inside of the boundary (an inner side of the area), a value of Φi takes a positive value corresponding to a distance from the boundary. Concerning a pixel present on an outside of the boundary (an outer side of the area), the value of Φi takes a negative value corresponding to a distance from the boundary.
Fα(Φ1, . . . , ΦK) in Equation (1) indicates an evaluation value obtained when an entire area Ω including all the K categories is subjected to image segmentation. In other words, it is possible to realize image segmentation of an image by calculating, through the repeated processing explained later, a boundary for minimizing the value of Fα(Φ1, . . . , ΦK).
The three terms on the right side of Equation (1) respectively mean, in order, a term that becomes smaller as the feature values in a segmented area are closer to the distribution specified by the parameters μi and σi, a term that becomes smaller as the length of a boundary line becomes smaller (less complicated), and a term for controlling redundancy of areas and the like.
On the other hand, in image segmentation employing the M-S method, the distance function Φi for minimizing the evaluation value Fα(Φ1, . . . , ΦK) can be calculated through the repeated processing of the following Equation (2).
In Equation (2), t is a suffix indicating the number of times of repetition (0≦t) in the repeated processing. In Equation (2), t=0 means that a boundary is an initial solution.
In the repeated processing employing Equation (2), the arithmetic unit 41b acquires, as an image segmentation result that converges to a final boundary, a processing result obtained either when it is detected that the change in the evaluation value Fα(Φ1, . . . , ΦK), calculated using the distance functions Φi obtained when the number of times of repetition reaches t and t+1, is 0 (or a very small value such as 0.01), or when it is detected that the number of times of repetition reaches a predetermined number of times (t=a predetermined value). In the present embodiment, for a reduction in the arithmetic operation time, the processing result (the distance functions Φi and the evaluation value Fα(Φ1, . . . , ΦK)) obtained when the number of times of repetition reaches the predetermined number of times is acquired as the image segmentation result.
In Equations (1) and (2), λ indicates a predetermined constant (a positive real number). Specifically, in the present embodiment, λ is set as 1.0.
In Equations (1) and (2), ei and γi respectively indicate weighting coefficients of the first term and the second term on the right side for each of the categories. Specifically, in the present embodiment, they are set as e1=e2= . . . =eK=1.0 and γ1=γ2= . . . =γK=1.0. Further, according to the present embodiment, by appropriately changing the values of these weighting coefficients, the significance of each of the terms on the right side or of each of the categories is changed. In other words, it is possible to control a convergence result (an image segmentation result) of an area boundary.
In Equations (1) and (2), δα and Hα indicate functions obtained by respectively approximating the Dirac delta function and the Heaviside function.
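Because Equations (1) and (2) are referenced but not reproduced here, the following Python sketch shows only a simplified multiphase level-set evolution in the spirit of the cited Samson et al. model: smoothed Heaviside and Dirac approximations for Hα and δα, a per-category Gaussian data term using μi and σi, a curvature (boundary-length) term, and a term discouraging redundancy/overlap of areas. The exact update rule, the approximation forms, and all names are assumptions, not the patent's Equation (2).

```python
import numpy as np

def heaviside(phi, alpha=1.0):
    """Smoothed Heaviside H_alpha."""
    return 0.5 * (1.0 + (2.0 / np.pi) * np.arctan(phi / alpha))

def delta(phi, alpha=1.0):
    """Smoothed Dirac delta delta_alpha (derivative of H_alpha)."""
    return (alpha / np.pi) / (alpha ** 2 + phi ** 2)

def curvature(phi, eps=1e-8):
    """div(grad(phi)/|grad(phi)|): the boundary-length (smoothness) term."""
    gy, gx = np.gradient(phi)
    norm = np.sqrt(gx ** 2 + gy ** 2) + eps
    nyy, _ = np.gradient(gy / norm)
    _, nxx = np.gradient(gx / norm)
    return nxx + nyy

def evolve(feature, phis, mu, sigma, e=1.0, gamma=1.0, lam=1.0,
           dt=0.1, iterations=200):
    """Simplified multiphase level-set evolution for K categories.

    feature:   2-D array of color-tone feature values.
    phis:      list of K initial distance functions (positive inside a region).
    mu, sigma: per-category parameter estimates.
    """
    for _ in range(iterations):
        coverage = sum(heaviside(p) for p in phis)   # for the partition/overlap term
        for i, phi in enumerate(phis):
            data = -e * ((feature - mu[i]) ** 2) / (sigma[i] ** 2 + 1e-8)
            smooth = gamma * curvature(phi)
            overlap = -lam * (coverage - 1.0)
            phis[i] = phi + dt * delta(phi) * (data + smooth + overlap)
    return phis
```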
On the other hand, in step S4 in
Thereafter, the arithmetic unit 41b determines whether the number of times of the arithmetic operation in step S4 in
In step S5 in
In other words, according to the present embodiment, since the image segmentation processing explained above is performed using the image data generated by the image data generating unit 41a, it is possible to clearly distinguish an area where a blood vessel is present from other areas in the living tissue 101. As a result, according to the present embodiment, it is possible to improve detection accuracy of a blood vessel present in the living tissue 101 compared with the case in the past.
The image segmentation processing in the present embodiment is not limited to processing performed using an image obtained when the normal light observation mode is selected. The image segmentation processing can be performed in substantially the same manner using, for example, images obtained when the narrowband light observation mode is selected (an image corresponding to Bn light and an image corresponding to Gn light).
The image segmentation processing in the present embodiment is not limited to processing performed using, as the predetermined color-tone feature value, a value obtained by dividing a pixel value of image data corresponding to the G light by a pixel value of image data corresponding to the R light. The image segmentation processing may be processing performed using, for example, a value obtained by dividing a pixel value of image data corresponding to the B light by a pixel value of image data corresponding to the G light. The image segmentation processing in the present embodiment may be processing performed using, as the predetermined color-tone feature value, a pixel value of any one of the image data corresponding to the R light, the G light, and the B light. Further, the image segmentation processing in the present embodiment may be processing performed using, as the predetermined color-tone feature value, a value obtained by converting pixel values of the RGB colorimetric system into an HSI colorimetric system or an L*a*b* colorimetric system.
The image segmentation processing in the present embodiment is not limited to processing using, as the predetermined color-tone feature value, only one feature value whose occurrence frequency conforms to a normal distribution. Plural feature values, whose occurrence frequencies conform to a multivariate normal distribution, may be used simultaneously (in this case, a variance-covariance matrix Σ only has to be used as the parameter in Equation (1) instead of the variance σi²).
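For the multivariate case mentioned above, the mean vector and the variance-covariance matrix Σ of a category could be estimated as in the following sketch; the two feature planes (g/r and b/g) and the mask are hypothetical.

```python
import numpy as np

def multivariate_parameters(gr_feature, bg_feature, category_mask):
    """Estimate the mean vector and variance-covariance matrix Sigma for one
    category when two color-tone feature values (g/r and b/g) are used together."""
    samples = np.stack([gr_feature[category_mask],
                        bg_feature[category_mask]], axis=0)  # shape (2, N)
    mean_vec = samples.mean(axis=1)   # 2-element mean vector
    cov = np.cov(samples)             # 2x2 variance-covariance matrix Sigma
    return mean_vec, cov
```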
The image segmentation processing in the present embodiment is not limited to processing for performing image segmentation on the basis of only the two categories, i.e., the area where a blood vessel is present and the area equivalent to the background mucous membrane. The image segmentation processing may be processing for performing image segmentation on the basis of categories equivalent to elements that could be included in an endoscopic image, such as a pit of a large intestine, MCE (Marginal Crypt Epithelium), fur, and mucus.
In the present embodiment, the image segmentation processing is not limited to processing performed assuming that an occurrence frequency of a color-tone feature value conforms to a normal distribution (a multivariate normal distribution). For example, the image segmentation processing may be performed by appropriately selecting and setting any one of a normal distribution and other probability distributions (other than the normal distribution) according to a distribution state of color-tone feature values in areas set as segmentation targets.
In the present embodiment, the processing for applying clustering to each pixel of an area having a large edge in order to estimate a parameter on the basis of a representative structural component is explained as an example. However, the processing is not limited to this and may be processing for directly performing clustering using feature values of all pixels including a background area.
A second embodiment of the present invention is explained next.
In the present embodiment, processing is performed using the endoscope apparatus 1 having a configuration similar to that in the first embodiment. Therefore, in the present embodiment, detailed explanation concerning the configuration of the endoscope apparatus is omitted.
Between the present embodiment and the first embodiment, the number of categories used for an arithmetic operation of Equation (1) and a method of estimating a parameter for each of the categories are mainly different. Therefore, in the present embodiment, differences from the first embodiment are mainly explained and similarities to the first embodiment are explained while omitting the explanation as appropriate.
Actions of the present embodiment are explained below.
First, after turning on a power supply for the units of the endoscope apparatus 1, a surgeon selects the normal light observation mode in the observation mode changeover switch 24. The surgeon inserts the endoscope 2 into a body cavity while looking at an image displayed on the display device 5 when the normal light observation mode is selected, i.e., an image having a color tone substantially the same as a color tone of an object seen by naked eyes to thereby bring the distal end portion 21b close to an area where the living tissue 101 as an observation target is present.
When the normal light observation mode is selected in the observation mode changeover switch 24, lights of respective colors, i.e., R light, G light, and B light are sequentially emitted from the light source device 3 to the living tissue 101. Images respectively corresponding to the lights of the respective colors are acquired in the endoscope 2.
When the image corresponding to the R light, the image corresponding to the G light, and the image corresponding to the B light are inputted, the image data generating unit 41a generates image data corresponding to the images (step S11 in
The arithmetic unit 41b calculates, on the basis of the image data generated in the image data generating unit 41a, for each of pixels of the image data, values of predetermined color-tone feature values gj/rj used for the following processing (step S12 in
Thereafter, the arithmetic unit 41b applies an EM algorithm, the number of classes of which is set to 2, to areas that do not satisfy a predetermined condition explained later among all areas present in the schematic image data shown in
The EM algorithm is a publicly-known method that can estimate plural parameters in a mixed normal distribution. Therefore, detailed explanation concerning the EM algorithm is omitted.
In applying the EM algorithm, it is a prerequisite that the number of classes (synonymous with the number of categories) is known. Therefore, in the present embodiment, the explanation assumes that the number of categories present in an image and the parameters of the categories are estimated by repeating classification into two classes a sufficient number of times and integrating the obtained results.
When it is a prerequisite that the occurrence frequency of a predetermined color-tone feature value (e.g., a value obtained by dividing a pixel value of image data corresponding to the G light by a pixel value of image data corresponding to the R light) in each pixel of the image data generated by the image data generating unit 41a conforms to a normal distribution, a mixed normal distribution obtained by combining normal distributions of two classes is, for example, as shown in
The arithmetic unit 41b repeatedly performs arithmetic operations related to the E step and the M step of the EM algorithm until an arithmetic operation stop condition is satisfied (e.g., the improvement of the likelihood of the two classes is equal to or smaller than a threshold; specifically, the threshold only has to be set to 0.01), to thereby respectively estimate parameters (an average and a variance of the predetermined color-tone feature value) for specifying a first normal distribution whose frequency peak is located at a relatively low value and parameters (an average and a variance of the predetermined color-tone feature value) for specifying a second normal distribution whose frequency peak is located at a relatively high value (step S13 and step S14 in
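A minimal sketch of the two-class EM estimation described above, assuming a one-dimensional two-component Gaussian mixture over the color-tone feature values and the 0.01 likelihood-improvement stop condition mentioned in the text; the initialization strategy and names are assumptions.

```python
import numpy as np

def em_two_classes(x, tol=0.01, max_iter=500):
    """Fit a two-component 1-D Gaussian mixture to feature values x by EM.

    Stops when the log-likelihood improves by less than `tol`.
    Returns (weights, means, variances) of the two classes.
    """
    w = np.array([0.5, 0.5])                                    # mixing weights
    m = np.array([np.percentile(x, 25), np.percentile(x, 75)])  # crude initial means
    v = np.array([x.var(), x.var()]) + 1e-8                     # initial variances
    prev_ll = -np.inf
    for _ in range(max_iter):
        # E step: posterior responsibility of each class for each sample.
        pdf = np.exp(-(x[:, None] - m) ** 2 / (2 * v)) / np.sqrt(2 * np.pi * v)
        joint = w * pdf
        resp = joint / joint.sum(axis=1, keepdims=True)
        # M step: update weights, means, and variances.
        nk = resp.sum(axis=0)
        w = nk / len(x)
        m = (resp * x[:, None]).sum(axis=0) / nk
        v = (resp * (x[:, None] - m) ** 2).sum(axis=0) / nk + 1e-8
        ll = np.log(joint.sum(axis=1)).sum()
        if ll - prev_ll < tol:
            break
        prev_ll = ll
    return w, m, v
```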
Subsequently, the arithmetic unit 41b discriminates, using the normal distributions specified by the parameters obtained as a result of the arithmetic operation in step S14 in
According to the processing explained above, each pixel in the schematic image data shown in
Subsequently, the arithmetic unit 41b determines whether the series of processing in steps S13 to S15 is applied again to the pixel groups classified into the respective classes (the categories) 1 and 2 obtained as a result of the processing in step S15 in
According to the first classification processing in step S15 in
Therefore, in step S16 in
On the other hand, when all the areas (all the pixel groups) obtained as the result of the processing in step S15 in
It is assumed that, according to the processing in step S17 in
In other words, according to the present embodiment, since the image segmentation processing explained above is performed using the image data generated by the image data generating unit 41a, it is possible to clearly distinguish an area where a blood vessel is present from other areas in the living tissue 101. As a result, according to the present embodiment, it is possible to improve detection accuracy of a blood vessel present in the living tissue 101 compared with the case in the past.
After the series of processing in steps S13 to S15 in
In the embodiments explained above, the arithmetic operation is performed using the estimation results of the average and the variance of the predetermined color-tone feature values as the parameters of Equation (1). However, it is also possible to perform the image segmentation processing according to a method other than this method. Specifically, for example, according to the contour detecting method disclosed in “Active Contours Without Edges” of IEEE TRANSACTIONS ON IMAGE PROCESSING, VOL. 10, NO. 2 (February 2001), it is possible to obtain an image segmentation processing result substantially the same as those in the embodiments explained above by performing the arithmetic operation using only the estimation result of the average of the predetermined color-tone feature values as a parameter.
The series of processing explained in the embodiments is not limited to processing applied to an endoscopic image and may be, for example, processing applied to an image obtained by a capsule endoscope or processing applied to various medical images such as monochrome images.
In the embodiments explained above, the processing using color-tone feature values as the feature values is explained as an example. However, the processing may be performed by appropriately selecting feature values other than the color-tone feature values according to the structural components set as segmentation targets, or by appropriately combining and using the color-tone feature values and the other feature values. Specifically, as the other feature values, structural feature values such as Gabor feature values obtained by applying a publicly-known Gabor filter to an image may be used, for example.
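As an illustrative sketch of such structural feature values, the following fragment computes Gabor filter response magnitudes with scikit-image; the frequencies and orientations are arbitrary example parameters, not values from the embodiment.

```python
import numpy as np
from skimage.filters import gabor

def gabor_features(gray_image, frequencies=(0.1, 0.2), orientations=4):
    """Compute simple Gabor-based structural feature values per pixel.

    Returns an array of shape (H, W, len(frequencies) * orientations)
    holding the magnitude of each filter response.
    """
    responses = []
    for f in frequencies:
        for k in range(orientations):
            theta = k * np.pi / orientations
            real, imag = gabor(gray_image, frequency=f, theta=theta)
            responses.append(np.sqrt(real ** 2 + imag ** 2))
    return np.stack(responses, axis=-1)
```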
In the embodiments explained above, the processing may be performed using color-tone feature values calculated for each pixel or may be performed using color-tone feature values calculated for each small area including plural pixels (e.g., 4×4 pixels).
The present invention is not limited to the embodiments explained above. It goes without saying that various modifications and applications are possible without departing from the spirit of the invention.
This application is a continuation application of PCT/JP2011/056425 filed on Mar. 17, 2011 and claims benefit of Japanese Application No. 2010-091671 filed in Japan on Apr. 12, 2010, the entire contents of which are incorporated herein by this reference.
Related application data: parent application PCT/JP2011/056425, filed March 2011 (US); child application No. 13197458 (US).