This invention relates generally to image processing. More particularly, in certain embodiments, the invention relates to segmentation of a sequence of colposcopic images based on measures of similarity.
It is common in the medical field to perform visual examination to diagnose disease. For example, visual examination of the cervix can discern areas where there is a suspicion of pathology. However, direct visual observation alone is often inadequate for identification of abnormalities in a tissue.
In some instances, when tissues of the cervix are examined in vivo, chemical agents such as acetic acid are applied to enhance the differences in appearance between normal and pathological areas. Aceto-whitening techniques may aid a colposcopist in the determination of areas where there is a suspicion of pathology.
However, colposcopic techniques generally require analysis by a highly trained physician. Colposcopic images may contain complex and confusing patterns. In colposcopic techniques such as aceto-whitening, analysis of a still image does not capture the patterns of change in the appearance of tissue following application of a chemical agent. These patterns of change may be complex and difficult to analyze. Current automated image analysis methods do not allow the capture of the dynamic information available in various colposcopic techniques.
Traditional image analysis methods include segmentation of individual images. Segmentation is a morphological technique that splits an image into different regions according to one or more pre-defined criteria. For example, an image may be divided into regions of similar intensity. It may therefore be possible to determine which sections of a single image have an intensity within a given range. If a given range of intensity indicates suspicion of pathology, the segmentation may be used as part of a diagnostic technique to determine which regions of an image may indicate diseased tissue.
However, standard segmentation techniques do not take into account dynamic information, such as a change of intensity over time. This kind of dynamic information is important to consider in various diagnostic techniques such as aceto-whitening colposcopy. A critical factor in discriminating between healthy and diseased tissue may be the manner in which the tissue behaves throughout a diagnostic test, not just at a given time. For example, the rate at which a tissue whitens upon application of a chemical agent may be indicative of disease. Traditional segmentation techniques do not take into account time-dependent behavior, such as rate of whitening.
The invention provides methods for relating aspects of a plurality of images of a tissue in order to obtain diagnostic information about the tissue. In particular, the invention provides methods for image segmentation across a plurality of images instead of only one image at a time. In a sense, inventive methods enable the compression of a large amount of pertinent information from a sequence of images into a single frame. An important application of methods of the invention is the analysis of a sequence of images of biological tissue in which an agent has been applied to the tissue in order to change its optical properties in a way that is indicative of the physiological state of the tissue. Diagnostic tests which have traditionally required analysis by trained medical personnel may be automatically analyzed using these methods. The invention may be used, for example, in addition to or in place of traditional analysis.
The invention provides methods of performing image segmentation using information from a sequence of images, not just from one image at a time. This is important because it allows the incorporation of time effects in image segmentation. For example, according to an embodiment of the invention, an area depicted in a sequence of images is divided into regions based on a measure of similarity of the changes those regions undergo throughout the sequence. In this way, inventive segmentation methods incorporate more information and can be more helpful, for example, in determining a characteristic of a tissue, than segmentation performed using one image at a time. The phrases “segmentation of an image” and “segmenting an image,” as used herein, may apply, for example, to dividing an image into different regions, or dividing into different regions an area in space depicted in one or more images (an image plane).
Segmentation methods of this invention allow, for example, the automated analysis of a sequence of images using complex criteria for determining a disease state which may be difficult or impossible for a human analyst to perceive by simply viewing the sequence. The invention also allows the very development of these kinds of complex criteria for determining a disease state by permitting the relation of complex behaviors of tissue samples during dynamic diagnostic tests to the known disease states of the tissue samples. Criteria may be developed using the inventive methods described herein to analyze sequences of images for dynamic diagnostic tests that are not yet in existence.
One way to relate a plurality of images to each other according to the invention is to create or use a segmentation mask that represents an image plane divided into regions that exhibit similar behavior throughout a test sequence. Another way to relate images is by creating or using graphs or other means of data representation which show mean signal intensities throughout each of a plurality of segmented regions as a function of time. Relating images may also be performed by identifying any particular area represented in an image sequence which satisfies given criteria.
In one aspect, the invention is directed to a method of relating a plurality of images of a tissue. The method includes the steps of: obtaining a plurality of images of a tissue; determining a relationship between two or more regions in each of two or more of the images; segmenting at least a subset of the two or more images based on the relationship; and relating two or more images of the subset of images based at least in part on the segmentation.
According to one embodiment, the step of obtaining a plurality of images of a tissue includes collecting an optical signal. In one embodiment, the optical signal includes fluorescence illumination from the tissue. In one embodiment, the optical signal includes reflectance, or backscatter, illumination from the tissue. In one embodiment, the tissue is illuminated by a white light source, a UV light source, or both. According to one embodiment, the step of obtaining images includes recording visual images of the tissue.
According to one embodiment, the tissue is or includes cervical tissue. In another embodiment, the tissue is one of the group consisting of epithelial tissue, colorectal tissue, skin, and uterine tissue. In one embodiment, the plurality of images being related are sequential images. According to one embodiment, a chemical agent is applied to the tissue. In one embodiment, the chemical agent is applied to change the optical properties of the tissue in a way that is indicative of the physiological state of the tissue. In one embodiment, the chemical agent is selected from the group consisting of acetic acid, formic acid, propionic acid, butyric acid, Lugol's iodine, Schiller's iodine, methylene blue, toluidine blue, osmotic agents, ionic agents, and indigo carmine. In certain embodiments, the method includes filtering two or more of the images. In one embodiment, the method includes applying either or both of a temporal filter, such as a morphological filter, and a spatial filter, such as a diffusion filter.
In one embodiment, the step of determining a relationship between two or more regions in each of two or more of the images includes determining a measure of similarity between at least two of the two or more images. In one embodiment, determining the measure of similarity includes computing an N-dimensional dot product of the mean signal intensities of two of the two or more regions. In one embodiment, the two regions (of the two or more regions) are neighboring regions.
According to one embodiment, the step of relating images based on the segmentation includes determining a segmentation mask of an image plane, where two or more regions of the image plane are differentiated. In one embodiment, the step of relating images based on the segmentation includes defining one or more data series representing a characteristic of one or more associated segmented regions of the image plane. In one embodiment, this characteristic is mean signal intensity.
According to one embodiment, the step of relating images includes creating or using a segmentation mask that represents an image plane divided into regions that exhibit similar behavior throughout the plurality of images. In one embodiment, the step of relating images includes creating or using graphs or other means of data representation which show mean signal intensities throughout each of a plurality of segmented regions as a function of time. In one embodiment, the step of relating images includes identifying a particular area represented in the image sequence which satisfies given criteria.
In another aspect, the invention is directed to a method of relating a plurality of images of a tissue, where the method includes the steps of: obtaining a plurality of images of a tissue; determining a measure of similarity between two or more regions in each of two or more of the images; and relating at least a subset of the images based at least in part on the measure of similarity. In one embodiment, the step of determining a measure of similarity includes computing an N-dimensional dot product of the mean signal intensities of two of the two or more regions. In one embodiment, the two regions are neighboring regions.
In another aspect, the invention is directed to a method of determining a tissue characteristic. The method includes the steps of: obtaining a plurality of images of a tissue; determining a relationship between two or more regions in each of two or more of the images; segmenting at least a subset of the two or more images based at least in part on the relationship; and determining a characteristic of the tissue based at least in part on the segmentation.
According to one embodiment, the step of obtaining a plurality of images of a tissue includes collecting an optical signal. In one embodiment, the optical signal includes fluorescence illumination from the tissue. In one embodiment, the optical signal includes reflectance, or backscatter, illumination from the tissue. In one embodiment, the tissue is illuminated by a white light source, a UV light source, or both. According to one embodiment, the step of obtaining images includes recording visual images of the tissue.
According to one embodiment, the tissue is or includes cervical tissue. In another embodiment, the tissue is one of the group consisting of epithelial tissue, colorectal tissue, skin, and uterine tissue. In one embodiment, the plurality of images being related are sequential images. According to one embodiment, a chemical agent is applied to the tissue. In one embodiment, the chemical agent is applied to change the optical properties of the tissue in a way that is indicative of the physiological state of the tissue. In one embodiment, the chemical agent is selected from the group consisting of acetic acid, formic acid, propionic acid, butyric acid, Lugol's iodine, Schiller's iodine, methylene blue, toluidine blue, osmotic agents, ionic agents, and indigo carmine. In certain embodiments, the method includes filtering two or more of the images. In one embodiment, the method includes applying either or both of a temporal filter, such as a morphological filter, and a spatial filter, such as a diffusion filter. In one embodiment, the method includes processing two or more images to compensate for a relative motion between the tissue and a detection device.
According to one embodiment, the step of determining a relationship between two or more regions in each of two or more of the images includes determining a measure of similarity between at least two of the two or more images. In certain embodiments, determining this measure of similarity includes computing an N-dimensional dot product of the mean signal intensities of two of the two or more regions. In one embodiment, the two regions are neighboring regions.
According to one embodiment, the segmenting step includes analyzing an aceto-whitening signal. In one embodiment, the segmenting step includes analyzing a variance signal. In one embodiment, the segmenting step includes determining a gradient image.
According to one embodiment, the method includes processing one or more optical signals based on the segmentation. In one embodiment, the method includes filtering at least one image based at least in part on the segmentation.
In certain embodiments, the step of determining a characteristic of the tissue includes determining one or more regions of the tissue where there is suspicion of pathology. In certain embodiments, the step of determining a characteristic of the tissue includes classifying a region of tissue as one of the following: normal squamous tissue, metaplasia, Cervical Intraepithelial Neoplasia, Grade I (CIN I), and Cervical Intraepithelial Neoplasia, Grade II or Grade III (CIN II/CIN III).
In another aspect, the invention is directed to a method of determining a characteristic of a tissue. The method includes the steps of: (a) for each of a first plurality of reference sequences of images of tissue having a first known characteristic, quantifying one or more features of each of a first plurality of mean signal intensity data series corresponding to segmented regions represented in each of the first plurality of reference sequences of images; (b) for a test sequence of images, quantifying one or more features of each of one or more mean signal intensity data series corresponding to one or more segmented regions represented in the test sequence of images; and (c) determining a characteristic of a tissue represented in the test sequence of images based at least in part on a comparison between the one or more features quantified in step (a) and the one or more features quantified in step (b).
According to one embodiment, step (c) includes repeating step (a) for each of a second plurality of reference sequences of images of tissue having a second known characteristic. In one embodiment, step (c) includes applying a classification rule based at least in part on the first plurality of reference sequences and the second plurality of reference sequences. In one embodiment, step (c) includes performing a linear discriminant analysis to determine the classification rule. In one embodiment, one of the one or more features of step (a) includes the slope of a curve at a given point fitted to one of the plurality of mean signal intensity data series. According to one embodiment, the method includes determining the segmented regions of the test sequence of images by analyzing an acetowhitening signal. In one embodiment, the first known characteristic is CIN II/CIN III and the second known characteristic is absence of CIN II/CIN III.
The objects and features of the invention can be better understood with reference to the drawings described below, and the claims. The drawings are not necessarily to scale, emphasis instead generally being placed upon illustrating the principles of the invention. In the drawings, like numerals are used to indicate like parts throughout the various views.
The patent or application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawing(s) will be provided by the U.S. Patent and Trademark Office upon request and payment of the necessary fee.
FIGS. 2A and 2A-1 depict human cervix tissue and show an area of which a sequence of images is to be obtained according to an illustrative embodiment of the invention.
In general, the invention provides methods for image segmentation across a plurality of images. Segmentation across a plurality of images provides a much more robust analysis than segmentation in a single image. Segmentation across multiple images according to the invention allows incorporation of a temporal element (e.g., the change of tissue over time in a sequence of images) in optics-based disease diagnosis. The invention provides means to analyze changes in tissue over time in response to a treatment. It also provides the ability to increase the resolution of segmented imaging by increasing the number of images over time. This adds a dimension to image-based tissue analysis, which leads to increased sensitivity and specificity of analysis. The following is a detailed description of a preferred embodiment of the invention.
The schematic flow diagram 100 of
Among the key steps of the inventive embodiments discussed here are determining a measure of similarity between regions of tissue represented in a sequence of images and segmenting the images based on the measure of similarity. Much of the mathematical complexity presented in this description regards various methods of performing these key steps. As will become evident, different segmentation methods have different advantages. The segmentation techniques of the inventive embodiments discussed herein include region merging, robust region merging, clustering, watershed, and region growing techniques, as well as combinations of these techniques.
In the illustrative embodiment, images of an area of interest are taken at N time steps {t0, t1, . . . , tN−1}. In one embodiment, time t0 corresponds to the moment of application of a chemical agent to the tissue, for instance, and time tN−1 corresponds to the end of the test. In another embodiment, time t0 corresponds to a moment following the application of a chemical agent to the tissue. For example, let S={0, . . . , r−1}×{0, . . . , c−1} and T={n0, . . . , nN−1} be the image and time domains, respectively, where r is the number of rows and c is the number of columns. Then, r×c discrete signals w(i,j;t) may be constructed describing the evolution of some optically-detectable phenomenon, such as aceto-whitening, with time. For an aceto-whitening example, the "whiteness" may be computed from RGB data of the images. There are any number of metrics which may be used to define "whiteness." For instance, an illustrative embodiment employs an intensity component, CCIR 601, as a measure of "whiteness" of any particular pixel, defined in terms of red (R), green (G), and blue (B) intensities as follows:
I=0.299R+0.587G+0.114B. (1)
The “whitening” data is then given by w(i,j;t)=I(i,j;n), for example. Alternatively, the signal w(i,j;t) is defined in any of a multiplicity of other ways. The characterization 206 of
w(i,j;n) ← w(i,j;n) − w(i,j;n0), ∀ n ∈ T. (2)
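For illustration, the following is a minimal sketch, assuming the image sequence is held in a NumPy array of shape (N, rows, cols, 3) of RGB frames (the array layout and the function name whitening_signal are hypothetical), of how the CCIR 601 intensity of Equation (1) and the background subtraction of Equation (2) might be computed:

```python
import numpy as np

def whitening_signal(frames_rgb):
    """Build the background-subtracted whitening signal w(i, j; n).

    frames_rgb: float array of shape (N, rows, cols, 3) holding the RGB
    frames of the sequence (a hypothetical input layout).
    Returns an array of shape (N, rows, cols).
    """
    # CCIR 601 intensity, Equation (1): I = 0.299 R + 0.587 G + 0.114 B
    intensity = (0.299 * frames_rgb[..., 0]
                 + 0.587 * frames_rgb[..., 1]
                 + 0.114 * frames_rgb[..., 2])
    # Background subtraction, Equation (2): subtract the first frame
    return intensity - intensity[0]
```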
Noise, glare, and sometimes chromatic artifacts may corrupt images in a sequence. Signal noise due to misaligned image pairs and local deformations of the tissue may be taken into account as well. Alignment functions and image restoration techniques often do not adequately reduce this type of noise. Therefore, it may be necessary to apply temporal and spatial filters.
According to the illustrative embodiment, a first filter is applied to the time axis, individually for each pixel. The images are then spatially filtered. Graph 302 of
For temporal filtering, the illustrative embodiment of the invention applies the morphological filter of Equation (3):
w(t)⊙b=½[(w∘b)•b+(w•b)∘b], (3)
where b is the structuring element, ∘ is the opening operator, and • is the closing operator. According to the illustrative embodiment, the structuring element has a half circle shape. The temporally-filtered data is connected by a series of line segments 310 in the graph 302 of FIG. 3A. The noise is decreased from the series 308 to the series 310.
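As a sketch of the open/close averaging of Equation (3), the following applies the filter to a single pixel's time series using SciPy's grey-scale morphology; the half-circle structuring element follows the text, while its radius and the function name are illustrative assumptions:

```python
import numpy as np
from scipy.ndimage import grey_opening, grey_closing

def temporal_morphological_filter(w, radius=3):
    """Smooth one pixel's time series w(t) with the average of Equation (3):
    0.5 * [ close(open(w, b), b) + open(close(w, b), b) ].
    The half-circle structuring element radius is an illustrative choice."""
    k = np.arange(-radius, radius + 1)
    b = np.sqrt(radius**2 - k**2)            # half-circle structuring element
    opened_then_closed = grey_closing(grey_opening(w, structure=b), structure=b)
    closed_then_opened = grey_opening(grey_closing(w, structure=b), structure=b)
    return 0.5 * (opened_then_closed + closed_then_opened)
```

In this sketch the filter is applied along the time axis, individually for each pixel, as the text describes.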
Illustratively, the images are then spatially filtered, for example, with either an isotropic or a Gaussian filter. A diffusion equation implemented by an illustrative isotropic filter may be expressed as Equation (4):
where ∇ is the gradient operator, Δ is the Laplacian operator, and τ is the diffusion time (distinguished from the time component of the whitening signal itself). An isotropic filter is iterative, while a Gaussian filter is an infinite impulse response (IIR) filter. The iterative filter of Equation (4) is much faster than a Gaussian filter, since the iterative filter allows for increasing smoothness by performing successive iterations. The Gaussian filter requires re-applying a more complex filter to the original image for increasing degrees of filtration. According to the illustrative embodiment, the methods of the invention perform two iterations. However, in other embodiments, the method performs one iteration or three or more iterations. The spatially-filtered data for a representative pixel is connected by a series of line segments 312 in graph 302 of FIG. 3A.
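Equation (4) itself is not reproduced in this excerpt; assuming it is the standard linear (heat) diffusion equation ∂w/∂τ = Δw, an explicit iterative implementation might look like the following sketch (the step size tau and the function name are illustrative):

```python
import numpy as np
from scipy.ndimage import laplace

def isotropic_diffusion(image, iterations=2, tau=0.2):
    """Explicit iteration of linear diffusion: w <- w + tau * Laplacian(w).
    Assumes Equation (4) is the standard heat equation; tau = 0.2 keeps the
    explicit scheme stable on a 4-neighbour discrete Laplacian."""
    w = image.astype(float)
    for _ in range(iterations):
        w += tau * laplace(w)
    return w

# For comparison, a Gaussian filter (scipy.ndimage.gaussian_filter) gives
# similar smoothing but must be re-applied to the original image with a
# larger sigma for a stronger degree of filtration.
```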
Graph 314 of
In the illustrative embodiment, the invention masks glare and chromatic artifacts from images prior to normalization. In the case of whitening data, glare may have a negative impact, since glare is visually similar to the tissue whitening that is the object of the analysis. Chromatic artifacts may have a more limited impact on the intensity of pixels and may be removed with the temporal and spatial filters described above.
Thresholding may be used to mask out glare and chromatic artifacts. In the illustrative embodiment, thresholding is performed in the L*u*v* color space. Preferably, the invention also employs a correlate for hue, expressed as in Equation (5):
Since the hue h* is a periodic function, the illustrative methods of the invention rotate the u*-v* plane such that the typical reddish color of the cervix correlates to higher values of h*. This makes it possible to work with a single threshold for chromatic artifacts. The rotation is given by Equation (6):
The masks for glare and chromatic artifacts are then respectively obtained using Equations (7) and (8):
mask_glare = L* > 90, (7)
mask_hue = h* < π/5, (8)
where L* ∈ [0,100] and h* ∈ [0,2π]. According to the illustrative embodiment, the masks are eroded to create a safety margin, such that they are slightly larger than the corrupted areas.
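A sketch of the masking step, assuming scikit-image's L*u*v* conversion; the rotation angle theta of the u*-v* plane (Equation (6) is not reproduced in this excerpt) and the pixel margin are hypothetical parameters, and the artifact masks are grown slightly, which is equivalent to eroding the retained area:

```python
import numpy as np
from scipy.ndimage import binary_dilation
from skimage.color import rgb2luv

def artifact_masks(frame_rgb, theta=0.0, margin=3):
    """Mask glare and chromatic artifacts as in Equations (7) and (8).
    theta is the (unspecified here) rotation of the u*-v* plane; margin is
    an illustrative safety margin in pixels."""
    luv = rgb2luv(frame_rgb)                    # L* in [0, 100]
    L, u, v = luv[..., 0], luv[..., 1], luv[..., 2]
    # hue correlate, rotated so the reddish cervix maps to higher h* in [0, 2*pi)
    h = np.mod(np.arctan2(v, u) + theta, 2 * np.pi)
    mask_glare = L > 90                         # Equation (7)
    mask_hue = h < np.pi / 5                    # Equation (8)
    # grow the artifact masks slightly to create the safety margin
    grow = np.ones((margin, margin), dtype=bool)
    return binary_dilation(mask_glare, grow), binary_dilation(mask_hue, grow)
```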
To reduce the amount of data to process and to improve the signal-to-noise ratio of the signals used in the segmentation techniques discussed below, illustrative methods of the invention pre-segment the image plane into grains. Illustratively, the mean grain surface is about 30 pixels. However, in other embodiments, it is between about a few pixels and about a few hundred pixels. The segmentation methods can be applied starting at either the pixel level or the grain level.
One way to “pre-segment” the image plane into grains is to segment each of the images in the sequence using a watershed transform. One goal of the watershed technique is to simplify a gray-level image by viewing it as a three-dimensional surface and by progressively “flooding” the surface from below through “holes” in the surface. In one embodiment, the third dimension is the gradient of an intensity signal over the image plane (further discussed herein below). A “hole” is located at each minimum of the surface, and areas are progressively flooded as the “water level” reaches each minimum. The flooded minima are called catchment basins, and the borders between neighboring catchment basins are called watersheds. The catchment basins determine the pre-segmented image.
Image segmentation with the watershed transform is preferably performed on the image gradient. If the watershed transform is performed on the image itself, and not the gradient, the watershed transform may obliterate important distinctions in the images. Determination of a gradient image is discussed herein below.
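A minimal sketch of the pre-segmentation into grains using scikit-image's watershed transform; the gradient image supplied as input is the fitting-value gradient discussed below, and the function name is illustrative:

```python
from skimage.segmentation import watershed

def pre_segment_grains(gradient):
    """Pre-segment the image plane into 'grains' by flooding the gradient
    surface from its local minima; each catchment basin becomes one grain.
    Returns an integer label image (the grain map)."""
    # With no explicit markers, watershed floods from every local minimum,
    # so every catchment basin of the gradient surface gets its own label.
    return watershed(gradient)
```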
Segmentation is a process by which an image is split into different regions according to one or more pre-defined criteria. In certain embodiments of the invention, segmentation methods are performed using information from an entire sequence of images, not just from one image at a time. The area depicted in the sequence of images is split into regions based on a measure of similarity of the detected changes those regions undergo throughout the sequence.
Segmentation is useful in the analysis of a sequence of images such as in aceto-whitening cervical testing. In an illustrative embodiment, since the analysis of time-series data with a one-pixel resolution is not possible unless motion, artifacts, and noise are absent or can be precisely identified, segmentation is needed. Often, filtering and masking procedures are insufficient to adequately relate regions of an image based on the similarity of the signals those regions produce over a sequence of images. Therefore, the illustrative methods of the invention average time-series data over regions made up of pixels whose signals display similar behavior over time.
In the illustrative embodiment, regions of an image are segmented based at least in part upon a measure of similarity of the detected changes those regions undergo. Since a measure of similarity between regions depends on the way regions are defined, and since regions are defined based upon criteria involving the measure of similarity, the illustrative embodiment of the invention employs an iterative process for segmentation of an area into regions. In some embodiments, segmentation begins by assuming each pixel or each grain (as determined above) represents a region. These individual pixels or grains are then grouped into regions according to criteria defined by the segmentation method. These regions are then merged together to form new, larger regions, again according to criteria defined by the segmentation method.
A problem that arises when processing raw image data is its high dimension. With a typical whitening signal for a single pixel described by, for example, a sixty-or-more-dimensional vector, it is often necessary to reduce data dimensionality prior to processing. In the illustrative embodiment, the invention obtains a scalar that quantifies a leading characteristic of two vectors. More particularly, illustrative methods of the invention take the N-dimensional inner (dot) product of two vectors corresponding to two pixel coordinates. A fitting function based on this dot product is shown in Equation (9). This fitting function quantifies the similarity between the signals at two locations.
where x1 and x2 are two pixel coordinates, and Ω(x1) = <w(x1;t),w(x1;t)> is the energy of the signal at location x1.
Curve 506 in
where Pk ⊂ ℤ2 is the set of all pixels that belong to the kth region and Nk is the size of Pk.
Curve 508 of
where the numerator represents the N-dimensional dot product of the background-subtracted mean signal intensities 504 of region k and region l; the denominator represents the greater of the energies of the signals corresponding to regions k and l, Ω(k) and Ω(l); and −1≦φkl≦1. In this embodiment, the numerator of Equation (11) is normalized by the higher signal energy and not by the square root of the product of both energies.
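A sketch of the region-level quantities just described, assuming the whitening signal is a NumPy array of shape (N, rows, cols); the helper names are hypothetical, and the fitting value follows the description of Equation (11) (dot product of the mean signals normalized by the greater signal energy):

```python
import numpy as np

def region_mean_signal(w, region_pixels):
    """Mean signal intensity of a region (in the spirit of Equation (10)):
    average the whitening signal w (shape (N, rows, cols)) over the region's
    pixel coordinates, given as a list of (row, col) tuples."""
    rows, cols = zip(*region_pixels)
    return w[:, list(rows), list(cols)].mean(axis=1)

def fitting_value(mean_k, mean_l):
    """Fitting value between two regions as described for Equation (11):
    N-dimensional dot product of the mean signals, normalized by the greater
    of the two signal energies, so that -1 <= phi_kl <= 1."""
    energy_k = np.dot(mean_k, mean_k)
    energy_l = np.dot(mean_l, mean_l)
    return np.dot(mean_k, mean_l) / max(energy_k, energy_l)
```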
In the case of whitening signals, for example, the fitting function defined by Equation (9) can be used to obtain a gradient image representing the variation of whitening values in x-y space. The gradient of an image made up of intensity signals is the approximation of the amplitude of the local gradient of signal intensity at every pixel location. The watershed transform is then applied to the gradient image. This may be done when pre-segmenting images into “grains” as discussed above, as well as when performing the hierarchical watershed segmentation approach and combined method segmentation approaches discussed below.
A gradient image representing a sequence of images is calculated for an individual image by computing the fitting value φ(i,j;i0,j0) between a pixel (i0, j0) and all its neighbors (i,j) ∈ E.
Since the best fit corresponds to a null gradient, the derivative of the fitting value is computed as in Equation (12):
Then, the derivatives of the signals are approximated as the mean of the forward and backward differences shown in Equations (14) and (15).
The norm of the gradient vector is then calculated from the approximations of Equations (14) and (15).
Since the fitting values include information from the entire sequence of images, one may obtain a gradient image which includes information from the entire sequence of images, and which, therefore, shows details not visible in all of the images. The gradient image may be used in the watershed pre-segmentation technique discussed herein above and the hierarchical watershed technique discussed herein below. Had the gradient image been obtained from a single reference image, less detail would be included, and the application of a watershed segmentation method to the gradient image would segment the image plane based on less data. However, by using a gradient image as determined from Equations (14) and (15), the invention enables a watershed segmentation technique to be applied which divides an image plane into regions based on an entire sequence of data, not just a single reference image.
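Equations (12) through (15) are not reproduced in this excerpt, so the following is only one plausible reading of the gradient image: central differences (the mean of forward and backward differences) of the whitening signal in each spatial direction, with the gradient norm accumulated over the whole time series:

```python
import numpy as np

def gradient_image(w):
    """One plausible reading of the gradient image described in the text,
    for a whitening signal w of shape (N, rows, cols).  This is an
    assumption, not the patent's exact formula; boundaries are handled
    simplistically by wrapping."""
    # mean of forward and backward differences = central difference
    d_dy = 0.5 * (np.roll(w, -1, axis=1) - np.roll(w, 1, axis=1))
    d_dx = 0.5 * (np.roll(w, -1, axis=2) - np.roll(w, 1, axis=2))
    # norm of the gradient vector, accumulated over the time dimension
    return np.sqrt((d_dx ** 2 + d_dy ** 2).sum(axis=0))
```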
Thus, the segmentation begins at step 604 of
This notation reveals the effect of normalizing using the higher energy of the two signals instead of normalizing each signal by its L2 norm. The method using Equation (16) or Equation (11) applies an additional "penalty" when the two signals have different energies; therefore, fitting values are below 1.0 when the scaled versions of the two signals are the same but their absolute values are different.
In step 606 of
In step 609, the method recalculates fitting values for all pairs of neighboring regions containing an updated (newly merged) region. In this embodiment, fitting values are not recalculated for pairs in which neither region has changed.
In step 610 of
Once merging is complete 612, a size rule is applied that forces each region whose size is below a given value to be merged with its best-fitting neighboring region, even if the fitting value is lower than the threshold. In this way, very small regions, no larger than a few pixels, are avoided.
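The merging loop of steps 604 through 612 might be sketched as follows; the data structures, threshold, and minimum region size are illustrative assumptions, not values taken from the text, and the final size rule is only indicated:

```python
import numpy as np

def merge_regions(region_signals, adjacency, threshold=0.9, min_size=10):
    """Greedy region-merging sketch.  region_signals maps a region id to a
    list of per-pixel whitening signals (1-D arrays over time); adjacency
    maps a region id to the set of its neighboring region ids.  The
    threshold and min_size values are illustrative, not from the text."""

    def mean_signal(rid):
        return np.mean(region_signals[rid], axis=0)

    def fit(a, b):
        wa, wb = mean_signal(a), mean_signal(b)
        return np.dot(wa, wb) / max(np.dot(wa, wa), np.dot(wb, wb))

    while True:
        # find the best-fitting pair of neighboring regions
        pairs = ((fit(a, b), a, b) for a in adjacency for b in adjacency[a] if a < b)
        best = max(pairs, default=(None, None, None))
        if best[0] is None or best[0] < threshold:
            break                       # no remaining pair exceeds the threshold
        _, a, b = best
        # merge region b into region a
        region_signals[a].extend(region_signals.pop(b))
        adjacency[a] |= adjacency.pop(b)
        adjacency[a] -= {a, b}
        # recalculate only pairs involving the updated region: redirect every
        # reference to b toward the merged region a
        for nbrs in adjacency.values():
            if b in nbrs:
                nbrs.discard(b)
                nbrs.add(a)
    # size rule: regions smaller than min_size would be forced into their
    # best-fitting neighbor here (omitted for brevity; it reuses fit()).
    return region_signals, adjacency
```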
The segmentation method of
Mean signal intensity may have a negative value after background subtraction. This is evident, for example, in the first part of data series 760 of
where w(k;t) is the mean signal intensity of region k as expressed in Equation (10). The merging criterion is then the energy of the standard deviation signal, computed as in Equation (18):
Segmentation using the illustrative robust region merging approach begins at step 804 of the schematic flow diagram 802 of
where k and l represent two neighboring regions to be merged. Thus, if a region can merge with more than one of its neighbors, it merges with the one that least increases the variance shown in Equation (19). Another neighbor may merge with the region in the next iteration, provided that it still meets the fitting criterion with the updated region.
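A sketch of the robust merging criterion, assuming each region is represented by an array of its per-pixel signals; the standard-deviation energy follows the description of Equations (17) and (18), while the exact normalization of Equation (19) is not reproduced in this excerpt:

```python
import numpy as np

def variance_energy(signals):
    """Energy of the standard-deviation signal of a region (in the spirit of
    Equations (17) and (18)): the per-time-step standard deviation of the
    pixel signals, squared and summed over time.  signals has shape
    (num_pixels, N)."""
    std_t = np.std(signals, axis=0)      # standard deviation signal of the region
    return np.dot(std_t, std_t)          # its energy

def best_merge_partner(signals_k, neighbor_signals):
    """Among a region's neighbors (a dict mapping neighbor id to its pixel
    signals), pick the one whose merger least increases the variance
    criterion, in the spirit of Equation (19)."""
    base = variance_energy(signals_k)
    costs = {l: variance_energy(np.vstack([signals_k, sig_l])) - base
             for l, sig_l in neighbor_signals.items()}
    return min(costs, key=costs.get)
```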
According to the illustrative embodiment, it is possible that a large region neighboring a small region will absorb the small region even when the regions have different signals. This results from the illustrative merging criterion being size-dependent: the change in variance is small when the smaller region is merged into the larger one. According to a further embodiment, the methods of the invention apply an additional criterion as shown in step 807 of
In step 809 of
In step 810 of
According to the illustrative embodiment, the method observes data series 956, 958, 960, and 962 in
Let X = {x1, . . . , xn} ⊂ ℝd be a set of n d-dimensional vectors. An objective of clustering is to split X into c subsets, called partitions, that minimize a given functional, Jm. In the case of the fuzzy c-means, this functional is given by Equation (20):
where vi is the "center" of the ith cluster, uik ∈ [0,1] is called the fuzzy membership of xk to vi, with
and m ∈ [1,∞] is a weighting exponent. The inventors have used m=2 in exemplary embodiments. The distance ∥•∥ is any inner product induced norm on ℝd. The minimization of Jm as defined by Equation (20) leads to the following iterative system:
The distance, ∥xk−vj∥, is based on the fitting function given in Equation (11). If it is assumed that similar signals are at a short distance from each other, then Equation (23) results:
where φki is given by Equation (11).
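A sketch of fuzzy c-means clustering over the region mean signals, assuming Equation (23) maps the fitting value to a squared distance as d2 = 1 − φ (an assumption, since Equations (21) through (23) are not reproduced in this excerpt); the iteration count and initialization are illustrative:

```python
import numpy as np

def fuzzy_c_means(signals, c=2, m=2.0, iters=50, eps=1e-9):
    """Fuzzy c-means sketch over region mean signals (rows of `signals`,
    shape (n, d)).  The squared distance d2 = 1 - phi is an assumed reading
    of Equation (23); the updates below are the standard FCM iteration for
    a functional of the form of Equation (20)."""
    n = signals.shape[0]
    u = np.random.dirichlet(np.ones(c), size=n).T       # memberships, shape (c, n)
    for _ in range(iters):
        um = u ** m
        # standard FCM center update (weighted mean of the signals)
        centers = um @ signals / um.sum(axis=1, keepdims=True)
        # fitting-based squared distance: d2(k, i) = 1 - phi(x_k, v_i)
        dots = signals @ centers.T                        # shape (n, c)
        e_x = np.einsum('ij,ij->i', signals, signals)[:, None]
        e_v = np.einsum('ij,ij->i', centers, centers)[None, :]
        phi = dots / np.maximum(e_x, e_v)
        d2 = np.clip(1.0 - phi, eps, None)
        # standard FCM membership update
        u = 1.0 / d2 ** (1.0 / (m - 1.0))                 # shape (n, c)
        u = (u / u.sum(axis=1, keepdims=True)).T          # back to shape (c, n)
    return u, centers
```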
Thus, in this embodiment, the segmentation begins at step 1004 of
Some embodiments have more regions than clusters, since pixels belonging to different regions with similar signals can contribute to the same cluster.
Similarly,
In step 1204 of
According to an embodiment, the method at step 1208 of
Certain embodiments use the hierarchical watershed to segment larger areas, such as large lesions or the background cervix. In some embodiments, the number of iterations is less than about 4 such that regions do not become too large, obscuring real details.
In some embodiments, the method performs one iteration of the hierarchical watershed, and continues merging regions using the robust region merging technique.
Other embodiments employ a "region growing technique" to perform step 110 of
One embodiment of the invention is a combined technique using the region growing algorithm, starting with a grain image, followed by performing one iteration of the hierarchical watershed technique, and then growing the selected region according to the robust region merging algorithm.
In some embodiments, the segmentation techniques discussed herein are combined in various ways. In some embodiments, the method processes and analyzes data from a sequence of images in an aceto-whitening test, for instance, using a coarse-to-fine approach. In one embodiment, a first segmentation reveals large whitening regions, called background lesions, which are then considered as regions of interest and are masked for additional segmentation.
A second segmentation step of the embodiment may outline smaller regions, called foreground lesions. Segmentation steps subsequent to the second step may also be considered. As used here, the term “lesion” does not necessarily refer to any diagnosed area, but to an area of interest, such as an area displaying a certain whitening characteristic during the sequence. From the final segmentation, regions are selected for diagnosis, preliminary or otherwise; for further analysis; or for biopsy, for example. Additionally, the segmentation information may be combined with manually drawn biopsy locations for which a pathology report may be generated. In one illustrative embodiment, the method still applies the pre-processing procedures discussed herein above before performing the multi-step segmentation techniques.
Then, a robust region merging procedure is applied, as shown in
Then, the method applies a hierarchical watershed segmentation procedure, as shown in
Then, the method applies a second clustering procedure, as shown in
Areas 1912 and 1910 in
Embodiments of the invention directed to applications other than the analysis of acetowhitening tests of cervical tissue may also use the analysis techniques described herein. A practitioner may customize elements of an analysis technique disclosed herein, based on the attributes of her particular application, according to embodiments of the invention. For instance, the practitioner may choose among the segmentation techniques disclosed herein, depending on the application for which she intends to practice embodiments of the inventive methods. By using the techniques described herein, it is possible to visually capture all the frames of a sequence at once and relate regions according to their signals over a period of time.
Certain embodiments of the inventive methods analyze more complex behavior. Some embodiments segment the image plane of a sequence of images, then extract features from the resulting mean intensity signals to characterize the signals of each segmented region. Examples of feature extraction procedures include any number of curve fitting techniques or functional analysis techniques used to mathematically and/or statistically describe characteristics of one or more data series. In some embodiments, these features are then used in a manual, automated, or semi-automated method for the classification of tissue.
For example, in certain embodiments, the method classifies a region of cervical tissue either as "high grade disease" tissue, which includes Cervical Intraepithelial Neoplasia II/III (CIN II/III), or as "not high grade disease" tissue, which includes normal squamous (NED, no evidence of disease), metaplasia, and CIN I tissue. The classification for a segmented region may be made within a predicted degree of certainty using features extracted from the mean signal intensity curve corresponding to the region. In one embodiment, this classification is performed for each segmented region in an image plane to produce a map of regions of tissue classified as high grade disease tissue. Other embodiments make more specific classifications and distinctions between tissue characteristics, such as distinction between NED, metaplasia, and CIN I tissue.
It was desired to classify each of the segmented regions as either “indicative of high grade disease” or “not indicative of high grade disease.” Thus, an embodiment of the invention extracted specific features from each of the mean signal intensity data series depicted in the graph 2120 of
A classification algorithm was designed using results of a clinical study. In the study, mean signal intensity curves were determined using sequences of images from acetowhitening tests performed on over 200 patients. The classification algorithm may be updated according to an embodiment of the invention upon conducting further or different clinical testing. The present algorithm was based upon two feature parameters extracted from each of certain mean signal intensity data series corresponding to segmented regions of the image sequences for which biopsies were performed. These two feature parameters are as follows:
A linear discriminant analysis with a jackknifed classification matrix was performed on the extracted features X and Y corresponding to certain of the mean signal intensity curves from each of the clinical tests. The curves used were those corresponding to regions for which tissue biopsies were performed. From the linear discriminant analysis, it was determined that a classification algorithm using the discriminant line shown in Equation (24) results in a diagnostic sensitivity of 88% and a specificity of 88% for the separation of CIN II/III (high grade disease) from the group consisting of normal squamous (NED), metaplasia, and CIN I tissue (not high grade disease):
Y=−0.9282X−0.1348. (24)
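A minimal sketch of applying the discriminant line of Equation (24); the definitions of the features X and Y are not reproduced in this excerpt, and which side of the line corresponds to high grade disease is an assumption of this sketch:

```python
def classify_region(x_feature, y_feature):
    """Apply the discriminant line of Equation (24) to a region's extracted
    features X and Y.  Which side of the line corresponds to high grade
    disease is an assumption here, not stated in this excerpt."""
    boundary = -0.9282 * x_feature - 0.1348
    return "CIN II/III suspected" if y_feature > boundary else "not high grade"
```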
Varying the classification model parameters by as much as 10% yields very similar model outcomes, suggesting the model features are highly stable.
While the invention has been particularly shown and described with reference to specific preferred embodiments, it should be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.
The present application is a continuation-in-part of U.S. patent application Ser. No. 10/068,133, filed Feb. 5, 2002, which is a continuation of U.S. patent application Ser. No. 09/738,614, filed Dec. 15, 2000, which claims priority to and the benefit of U.S. Provisional Patent Application Ser. No. 60/170,972, filed Dec. 15, 1999; also, the present application claims the benefit of U.S. Provisional Patent Application Ser. No. 60/353,978, filed Jan. 31, 2002. All of the above applications are assigned to the common assignee of this application and are hereby incorporated by reference.
This invention was made with government support under Grant No. 1-R44-CA-91618-01 awarded by the U.S. Department of Health and Human Services. The government has certain rights in the invention.