Feature adapted beamlet transform apparatus and associated methodology of detecting curvilinear objects of an image

Information

  • Patent Grant
  • 8103122
  • Patent Number
    8,103,122
  • Date Filed
    Monday, April 14, 2008
  • Date Issued
    Tuesday, January 24, 2012
Abstract
A method of detecting a curvilinear object of a noisy image. The method includes filtering the noisy image in accordance with a two dimensional line profile. The line profile is selected within a class of steerable filters. A beamlet coefficient is calculated in accordance with the filtering, wherein a coefficient above a predetermined threshold identifies a local feature.
Description
BACKGROUND

The claimed advancements described herein relate to a system and associated methodology of detecting curvilinear objects in an image. More specifically, an apparatus and associated methodology are provided for performing a feature adapted Beamlet transform for the detection of curvilinear objects in a noisy image via a software and/or hardware implementation.


In image processing systems, computer vision applications, and the like, the detection of curvilinear objects is often desired. Such objects occur in virtually every natural or synthetic image, for example as contours of objects, roads in aerial imaging, or DNA filaments in microscopy applications. Currently, there is no known methodology in which a steerable filter may be leveraged together with line segment processing methodologies, such as beamlet methods, to represent curvilinear objects carrying a specific line-profile.


Curvilinear objects are considered as one-dimensional manifolds that have a specific profile running along a smooth curve. The shape of this profile may be an edge or a ridge-like feature, or it can be represented by more complex, specially designed features. For example, in the context of DNA filament analysis in fluorescent microscopy, it is acceptable to consider the transverse dimension of a filament to be small relative to the PSF (point spread function) width of the microscope. Hence, the shape of the profile may be accurately approximated by a PSF model.


One way to detect curvilinear objects is to track locally the feature of the curve profile; linear filtering and template matched filtering are well-known techniques for doing so. The classical Canny edge detector and more recently designed detectors are based on such linear filtering techniques. They involve the computation of inner products with shifted and/or rotated versions of the feature template at every point in the image. A high response at a given position in the image means that the considered area is similar to the feature template. Filtering is usually followed by a non-maxima suppression and a thresholding step in order to extract the objects. The major drawback of such approaches comes from the fact that linear filtering is based on local operators. Hence it is highly sensitive to noise but not sensitive to the underlying smoothness of the curve, which is a typically non-local property of curvilinear objects.
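By way of illustration only, the following Python sketch shows the kind of local linear filtering described above: the image is correlated with rotated copies of a feature template, the maximum response over orientations is retained, and a threshold extracts candidate pixels. The template shape, the number of orientations, and all parameter values are assumptions of this illustration, not part of the claimed method.

import numpy as np
from scipy import ndimage

def ridge_template(size=9, sigma=1.5):
    # Hypothetical ridge-like template: every row carries the same zero-mean
    # Gaussian cross-section, giving a vertical bright ridge.
    ax = np.arange(size) - size // 2
    profile = np.exp(-ax**2 / (2.0 * sigma**2))
    profile -= profile.mean()              # zero-mean so flat areas give no response
    return np.tile(profile, (size, 1))

def local_linear_detection(image, n_orientations=8, threshold=0.5):
    # Correlate the image with rotated templates and threshold the maximum response.
    template = ridge_template()
    responses = []
    for k in range(n_orientations):
        angle = 180.0 * k / n_orientations
        rot = ndimage.rotate(template, angle, reshape=False, order=1)
        responses.append(ndimage.correlate(image, rot, mode='reflect'))
    best = np.max(responses, axis=0)        # strongest response over all orientations
    return best > threshold * best.max()    # purely local decision

# usage on a synthetic line corrupted by noise
img = np.zeros((64, 64)); img[32, 10:54] = 1.0
img += 0.3 * np.random.randn(*img.shape)
mask = local_linear_detection(img)

Because every decision above is made pixel by pixel, the result degrades quickly as noise increases, which motivates the non-local techniques discussed next.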


Alternatively, the Radon transform is a powerful non-local technique which may be used for line detection. Also known as the Hough transform in the case of discrete binary images, it performs a mapping from the image space into a line parameter space by computing line integrals. Formally, given an image f defined on a subspace of R², for every line parameter (ρ, θ) it computes










φ(ρ,θ) = ∫_{−∞}^{+∞} ∫_{−∞}^{+∞} f(x,y) δ(ρ − x cos(θ) − y sin(θ)) dx dy.  (1)








Peaks in the parameter space reveal potential lines of interest. This is a very reliable method for detecting lines in noisy images. However, there are several limitations. First, a direct extension of that method to detect more complex curves is unfeasible in practice, since each additional curve parameter adds one dimension to the parameter space and increases the complexity exponentially. In addition, the Radon transform computes line integrals over lines that pass through the whole image domain and therefore does not provide information on small line segments.
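A minimal discrete reading of equation (1), in the Hough-transform form mentioned above for binary images, is sketched below; the accumulator discretization (one-degree angular steps, integer ρ bins) is an assumption of the illustration.

import numpy as np

def hough_lines(binary, n_theta=180):
    # Discrete Hough/Radon accumulator: each foreground pixel votes for every line
    # rho = x*cos(theta) + y*sin(theta) passing through it.
    h, w = binary.shape
    thetas = np.deg2rad(np.arange(n_theta))
    diag = int(np.ceil(np.hypot(h, w)))
    rhos = np.arange(-diag, diag + 1)
    acc = np.zeros((len(rhos), n_theta))
    ys, xs = np.nonzero(binary)
    for x, y in zip(xs, ys):
        r = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int)
        acc[r + diag, np.arange(n_theta)] += 1
    return acc, rhos, thetas

Peaks of the accumulator correspond to whole lines crossing the image, which illustrates the limitation noted above: short segments are not localized by this representation.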


Given an image of N×N pixels, the number of possible line segments is in O(N⁴). Direct evaluation of line integrals over the whole set of segments is practically infeasible due to the computational burden. One of the methodologies proposed to address this problem is the Beamlet transform. It defines a set of dyadically organized line segments occupying a range of dyadic locations and scales and spanning a full range of orientations. These line segments, called beamlets, have both of their end-points lying on dyadic squares obtained by recursive partitioning of the image domain. The collection of beamlets has O(N² log(N)) cardinality. The underlying idea of the Beamlet transform is to compute line integrals only on this smaller set, which is an efficient substitute for the entire set of segments since any segment can be approximated by a finite chain of beamlets. The beamlet chaining technique also provides an easy way to approximate piecewise constant curves.
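The dyadic organization can be made concrete with the following sketch, which enumerates, for every dyadic square, the segments joining two pixels of its boundary; the pixel-level discretization of the end-points is a simplification assumed for the illustration.

def beamlets_in_square(x0, y0, side):
    # All segments whose two end-points lie on the boundary of one dyadic square.
    top    = [(x0 + i, y0) for i in range(side)]
    right  = [(x0 + side - 1, y0 + i) for i in range(1, side)]
    bottom = [(x0 + side - 1 - i, y0 + side - 1) for i in range(1, side)]
    left   = [(x0, y0 + side - 1 - i) for i in range(1, side - 1)]
    boundary = top + right + bottom + left
    return [(boundary[i], boundary[j])
            for i in range(len(boundary)) for j in range(i + 1, len(boundary))]

def beamlet_dictionary(n):
    # Dyadically organized beamlets over an n-by-n image (n a power of two).
    beams, side = [], n
    while side >= 2:
        for y0 in range(0, n, side):
            for x0 in range(0, n, side):
                beams.extend(beamlets_in_square(x0, y0, side))
        side //= 2
    return beams

print(len(beamlet_dictionary(32)))

For a 32×32 image this dictionary contains a few tens of thousands of beamlets, versus on the order of N⁴ ≈ 10⁶ arbitrary segments, in line with the O(N² log(N)) cardinality stated above.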


Formally, given a beamlet b=(x, y, l, θ) centered at position (x,y), with a length l and an orientation θ, the coefficient of b computed by the Beamlet transform is given by










Φ(f,b) = ∫_{−l/2}^{l/2} f(x + γ cos(θ), y + γ sin(θ)) dγ.  (2)







Equation (2) is closely related to equation (1), since the Beamlet transform can be viewed as a multiscale Radon transform; both integrate image intensity along line segments. However, neither takes into account any line-profile, which implies that the Radon and Beamlet transforms are not well adapted to represent curvilinear objects carrying a specific line-profile.
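For illustration, equation (2) can be evaluated numerically by sampling the image along the beamlet; the step size and the bilinear interpolation used below are implementation assumptions.

import numpy as np
from scipy import ndimage

def beamlet_coefficient(image, x, y, length, theta, step=0.5):
    # Approximate equation (2): integrate image intensity along the segment of
    # length `length`, centred at (x, y), oriented at angle `theta`.
    gammas = np.arange(-length / 2.0, length / 2.0 + step, step)
    xs = x + gammas * np.cos(theta)
    ys = y + gammas * np.sin(theta)
    # map_coordinates expects (row, col) = (y, x); order=1 is bilinear interpolation
    samples = ndimage.map_coordinates(image, [ys, xs], order=1, mode='nearest')
    return samples.sum() * step            # Riemann-sum approximation of the line integral

# usage: a bright horizontal line gives a large coefficient for theta = 0
img = np.zeros((64, 64)); img[32, :] = 1.0
print(beamlet_coefficient(img, 32.0, 32.0, 40.0, 0.0))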


Accordingly, a feature-adapted Beamlet transform is provided to represent curvilinear objects of a specific line profile.


SUMMARY OF EXEMPLARY ASPECTS OF THE ADVANCEMENTS

In one aspect, a method of detecting a curvilinear object of a noisy image is provided. The method includes filtering the noisy image in accordance with a two dimensional line profile. The line profile is selected within a class of steerable filters. A beamlet coefficient is calculated in accordance with the filtering, wherein a coefficient above a predetermined threshold identifies a local feature.


In a further aspect, a method of detecting a curvilinear object of a noisy image is disclosed. The method includes filtering the noisy image in accordance with a two dimensional line profile. The line profile is selected within a class of steerable filters. A beamlet coefficient is calculated in accordance with the filtering, wherein a coefficient above a predetermined threshold identifies a local feature. The noisy image is convolved in accordance with a number of basis filters.


In still a further aspect of the invention, a method of detecting a curvilinear object of a noisy image is disclosed. The method includes filtering the noisy image in accordance with a two dimensional line profile. The line profile is selected within a class of steerable filters. A beamlet coefficient is calculated in accordance with the filtering, wherein a coefficient above a predetermined threshold identifies a local feature. The noisy image is convolved in accordance with a number of basis filters and each image is computed by linear combination.


It is to be understood that both the foregoing general description of the invention and the following detailed description are exemplary, but are not restrictive, of the invention.





BRIEF DESCRIPTION OF THE DRAWINGS

The patent or application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawing(s) will be provided by the Office upon request and payment of the necessary fees.


A more complete appreciation of the invention and many of the attendant advantages thereof will be readily obtained as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings, wherein:



FIG. 1 illustrates a high level block diagram of a feature-adapted Beamlet transform in accordance with an exemplary aspect of the disclosure;



FIG. 2 illustrates an example of an original image before noise corruption;



FIG. 3 illustrates the original image of FIG. 2 corrupted with noise;



FIG. 4 illustrates the use of 3rd order edge detection of curvilinear objects as applied to the noisy image in FIG. 3;



FIG. 5 illustrates detection using feature adapted beamlet transform carrying the same 3rd order filter, as applied to the noisy image in FIG. 3, in accordance with an exemplary aspect of the disclosure;



FIG. 6 illustrates an example of an original image of DNA filaments obtained by fluorescent microscopy;



FIG. 7 illustrates the use of 2nd order edge detection of curvilinear objects using standard beamlet transform, as applied to the original image in FIG. 6; and



FIG. 8 illustrates detection using feature adapted beamlet transform carrying the same 2nd order filter, as applied to the original image in FIG. 6, in accordance with an exemplary aspect of the disclosure.





DETAILED DESCRIPTION

A feature adapted beamlet transform apparatus and associated methodology of detecting curvilinear objects of an image are provided. The approach unifies the Beamlet transform with a linear filtering technique to introduce the Feature-adapted Beamlet transform, which is able to incorporate knowledge of a desired line-profile running along curves. If the profile is designed as a steerable filter, this methodology leads to an efficient implementation.



FIG. 1 shows the Feature-adapted Beamlet transform block diagram in accordance with an exemplary embodiment. A typical input image 2 is shown for processing by the feature adapted Beamlet transform system, generally designated 5. The front end of the system includes a bank of dedicated basis filters 4, which convolve the input image. The output of the basis filters 4 is transformed with a Beamlet transform 6. In this regard, it is noted that the basis filters 4 and the Beamlet transform 6 are equally applicable to remotely distributed nodes or stand-alone systems. For example, those skilled in the art will recognize that such processing may be computed on independent systems. In such cases, the Beamlet transform may be employed by a stand-alone system as a local utility for facilitating image recognition. As such, it is to be understood that the basis filters 4 and the Beamlet transform 6 may correspond to separate devices or separate aspects of a same device in accordance with the advancements described herein. Likewise, components 6 and 8, although illustrated as separate objects, are, in the exemplary embodiments described herein, components of a feature adapted Beamlet transform apparatus.


The output from the Beamlet transform 6 is multiplied by a set of gain maps 8, which apply the appropriate interpolation functions at each position and time. The summation junction 10 produces the adaptively filtered and transformed image 12.


The system 5 of FIG. 1 may embrace a personal computing device such as a PC employing an Intel Pentium processor. The instruction set described in detail below is provided as a utility application executing in conjunction with a local processor and an operating system such as Microsoft VISTA®, Unix, Solaris, Linux, Apple MAC-OS, or other systems known to those skilled in the art. Alternatively, those skilled in the art will recognize the applicability to mobile devices such as PDAs, phones, and portable entertainment devices which employ Symbian, Microsoft Mobile®, and like operating systems. The memory required for supporting the registries, kernel, and like features of FIG. 1 is omitted as well known. Likewise, the description of general features of FIG. 1, such as local volatile and/or non-volatile memory, I/O capabilities, and common peripheral devices, as well as the corresponding functionality, has been omitted for brevity, the detailed operation/description of which is well known to those skilled in the art. The specific coding and porting of the algorithms described herein is within the ability of one skilled in the art upon review of this specification.


It is understood that within the scope of the appended claims, the inventions may be practiced otherwise than as specifically described herein. For example, while described in terms of both software and hardware components interactively cooperating, it is contemplated that the feature adapted beamlet transform described herein may be practiced entirely in software, firmware, or as an Application Specific Integrated Circuit (ASIC).


As recognized by those skilled in the art, software and firmware may be embodied on a computer readable storage medium such as an optical disc or semiconductor memory.


Moreover, the feature adapted beamlet transform may be implemented as a web based utility or a web based service invoked remotely by a known protocol such as SOAP. For example, it is envisioned that the feature adapted beamlet transform may be leveraged in a research environment where images are provided to the utility via a network for servicing a group of users. Likewise, remote devices may be employed to access the feature adapted beamlet transform via a number of wireless protocols such as BLUETOOTH® and IEEE 802.11x wireless formats.


All steps relative to a single basis filter can be computed simultaneously on a parallel machine. All these steps have O(N²) complexity. In this scheme, the evaluation of beamlet coefficients consumes most of the computation time. To increase speed, a cache strategy may be employed to pre-compute most of the computation while utilizing an approximation of beamlet coefficients based on the two-scale recursion technique. This arrangement increases speed at the expense of a memory load. For a 1024×1024 image, an implementation of the standard Beamlet transform takes approximately 1 s on a dual-processor computer.
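The two-scale recursion mentioned above can be read, in simplified form, as follows: the coefficient of a long beamlet is approximated by the sum of the coefficients of its two half-length children obtained by splitting it at its midpoint, so that only finest-scale integrals need to be evaluated (and cached) directly. The sketch below illustrates this simplified reading only; the cached, table-driven implementation referred to in the text is not shown.

import numpy as np
from scipy import ndimage

def line_integral(image, x, y, length, theta, step=0.5):
    # Direct evaluation of a finest-scale coefficient by bilinear sampling.
    g = np.arange(-length / 2.0, length / 2.0 + step, step)
    return ndimage.map_coordinates(
        image, [y + g * np.sin(theta), x + g * np.cos(theta)],
        order=1, mode='nearest').sum() * step

def coefficient_two_scale(image, x, y, length, theta, min_len=4.0):
    # Two-scale recursion: a beamlet coefficient equals the sum of the coefficients
    # of its two half-length children, split at the midpoint.  In a cached
    # implementation the finest-scale children are precomputed once and reused.
    if length <= min_len:
        return line_integral(image, x, y, length, theta)
    dx, dy = (length / 4.0) * np.cos(theta), (length / 4.0) * np.sin(theta)
    half = length / 2.0
    return (coefficient_two_scale(image, x - dx, y - dy, half, theta, min_len) +
            coefficient_two_scale(image, x + dx, y + dy, half, theta, min_len))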


Consider a filter h representing a 2-dimensional line-profile. Let hθ be a rotated version of h in the direction θ:

hθ(x,y)=h(Rθ(x,y)),  (3)

where Rθ is the 2-dimensional rotation matrix of angle θ. In a first step, consider filtering the input image 2, denoted f, with hθ before computing the beamlet coefficient from equation (2), reproduced below:










Φ(f,b) = ∫_{−l/2}^{l/2} f(x + γ cos(θ), y + γ sin(θ)) dγ.  (2)








This yields:










Ψ(f,b) = ∫_{−l/2}^{l/2} (f ∗ hθ)(x + γ cos(θ), y + γ sin(θ)) dγ.  (4)







A high coefficient indicates that the local feature runs significantly along b. This is the Feature-adapted Beamlet transform 6. In general, the computation of all beamlet coefficients in this way is not conceivable, since it would require convolving the image as many times as there are orientations θ. For the special case where h is selected within the class of steerable filters, consider writing hθ as a linear combination of basis filters 4:












hθ(x,y) = Σ_{j=1}^{M} kj(θ) hθj(x,y),  (5)







where the kj's 8 are interpolation functions that depend only on θ. The basis filters hθj 4 are independent of θ. A convolution of an image with a steerable filter of arbitrary orientation is then equal to a finite weighted sum of convolutions of the same image with the basis filters. Hence, equation (4) can be written as













Ψ(f,b) = Σ_{j=1}^{M} kj(θ) ∫_{−l/2}^{l/2} fθj(x + γ cos(θ), y + γ sin(θ)) dγ
       = Σ_{j=1}^{M} kj(θ) Φ(fθj, b),  (6)








where fθj = f*hθj and Φ(fθj, b) corresponds to the beamlet coefficient of b computed over fθj using equation (2). As a result, in order to compute equation (4) for every beamlet coefficient, consider the following: first, convolve the image as many times as the number of basis filters composing the filter h; this number is typically very small. On each filtered image, compute the standard Beamlet transform. Finally, for each beamlet, compute its coefficient using equation (6).
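A minimal end-to-end sketch of these three steps is given below, using, purely as an example of a steerable line-profile, the second derivative of a Gaussian, which is exactly steerable with M=3 basis filters and the interpolation functions k1(θ)=cos²θ, k2(θ)=2 cos θ sin θ, k3(θ)=sin²θ. The choice of profile, the sampling step, and all parameter values are assumptions of the illustration, not the patented implementation.

import numpy as np
from scipy import ndimage

def g2_basis(size=15, sigma=2.0):
    # Second-derivative-of-Gaussian basis filters (Gxx, Gxy, Gyy): the directional
    # second derivative at any angle is an exact linear combination of these three.
    ax = np.arange(size) - size // 2
    x, y = np.meshgrid(ax, ax)
    g = np.exp(-(x**2 + y**2) / (2.0 * sigma**2))
    return [((x**2 - sigma**2) / sigma**4) * g,
            (x * y / sigma**4) * g,
            ((y**2 - sigma**2) / sigma**4) * g]

K_FUNCS = [lambda t: np.cos(t)**2,                 # interpolation functions k_j(theta)
           lambda t: 2.0 * np.cos(t) * np.sin(t),  # matching the basis order above
           lambda t: np.sin(t)**2]

def fabt_coefficient(filtered, x, y, length, theta, step=0.5):
    # Equation (6): weighted sum of standard beamlet coefficients Phi(f_theta_j, b)
    # computed on the basis-filtered images.
    g = np.arange(-length / 2.0, length / 2.0 + step, step)
    rows, cols = y + g * np.sin(theta), x + g * np.cos(theta)
    phis = [ndimage.map_coordinates(f, [rows, cols], order=1, mode='nearest').sum() * step
            for f in filtered]
    return sum(k(theta) * p for k, p in zip(K_FUNCS, phis))

# Step 1: convolve the image once per basis filter (M is small); stand-in image used here.
image = np.random.rand(128, 128)
filtered = [ndimage.convolve(image, b, mode='reflect') for b in g2_basis()]
# Steps 2-3: beamlet coefficients on each filtered image, combined per equation (6).
psi = fabt_coefficient(filtered, x=64.0, y=64.0, length=32.0, theta=0.3)

Because the basis filters are independent of θ, the convolutions are performed once and any beamlet orientation is then obtained by the weighted combination of equation (6).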


A detection method using the Feature-adapted Beamlet transform 10 provides a list of beamlets that best represent curvilinear objects carrying a specific line-profile in an image. The method is based on a multiscale coefficient thresholding technique.


A Recursive Dyadic Partition (RDP) of the image domain is any partition, starting from the whole image domain, obtained by recursively choosing between replacing any square of the partition by its decomposition into four dyadic squares or leaving it unsplit. This concept is very similar to the quadtree decomposition technique. A beamlet-decorated RDP (BD-RDP) is an RDP in which each terminal node of the partition is associated with at most one beamlet. By construction, a BD-RDP provides a list of non-overlapping beamlets. In order to select the list of beamlets that best represent curvilinear objects in the image 2, maximize over all beamlet-decorated recursive dyadic partitions P={S1, S2, . . . , Sn} the following complexity-penalized residual sum of squares:












E(P) = Σ_{S∈P} CS² − λ² #P,

where

CS = max_{b⊂S} |FBT(f,b)| / √l  (7)








measures the energy required to model the region S of the image f by the beamlet b, and λ is an MDL-like criterion that controls the complexity of the model. A high value of λ yields a coarse representation of curvilinear structures; a small value leads to a quite complex model with potentially a significant number of false alarms. Equation (7) can be solved very efficiently by a recursive tree-pruning algorithm due to the additivity of the cost function.
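The additivity that makes the tree pruning possible can be illustrated as follows: for each dyadic square, the best achievable score is either its own term CS² − λ² or the sum of the best scores of its four children, whichever is larger. The sketch below assumes a function that returns CS for a given square and is provided only to illustrate the recursion, not the full BD-RDP bookkeeping.

import numpy as np

def best_partition(square, coeff, lam, min_side=4):
    # Recursive tree pruning for equation (7): decide, for each dyadic square, whether
    # keeping its best beamlet (score C_S^2 - lambda^2) beats splitting into children.
    # `coeff(square)` is assumed to return C_S for the square (x0, y0, side).
    x0, y0, side = square
    leaf_score = coeff(square) ** 2 - lam ** 2
    if side <= min_side:
        return leaf_score, [square]
    half = side // 2
    children = [(x0, y0, half), (x0 + half, y0, half),
                (x0, y0 + half, half), (x0 + half, y0 + half, half)]
    child_results = [best_partition(c, coeff, lam, min_side) for c in children]
    split_score = sum(s for s, _ in child_results)
    if split_score > leaf_score:
        return split_score, [sq for _, sqs in child_results for sq in sqs]
    return leaf_score, [square]

# usage with a stand-in coefficient function (a real one would maximize the
# feature-adapted beamlet coefficients over all beamlets inside the square)
score, squares = best_partition((0, 0, 64), lambda s: np.random.rand(), lam=0.5)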


Consider comparing this methodology with a linear filtering technique which convolves the image with a steerable filter and solves, for each image point, a polynomial equation in order to find the optimal orientation maximizing the filter response. This step is followed by a non-maxima suppression and a thresholding step. Exemplary steerable filters are combinations of Gaussian-based filters which are optimized under Canny-like criteria, such as a 3rd order filter. FIG. 3 shows a noisy image corrupted by Gaussian white noise with standard deviation σnoise=50; FIG. 4 shows the corresponding linear-filtering detection and FIG. 5 the detection obtained with the feature adapted Beamlet transform. In the methods described herein, the well-known Bresenham algorithm is used to highlight pixels traversed by meaningful beamlets. In both cases, the threshold value is determined to keep 2,000 pixels. As shown in FIG. 5, the number of false positives is highly reduced.
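For reference, the Bresenham traversal used to highlight the pixels crossed by the selected beamlets can be sketched as follows; the example end-points are arbitrary and only illustrate the usage.

import numpy as np

def bresenham(x0, y0, x1, y1):
    # Classic Bresenham line algorithm: integer pixels traversed between two end-points.
    dx, dy = abs(x1 - x0), -abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx + dy
    pixels = []
    while True:
        pixels.append((x0, y0))
        if x0 == x1 and y0 == y1:
            break
        e2 = 2 * err
        if e2 >= dy:
            err += dy
            x0 += sx
        if e2 <= dx:
            err += dx
            y0 += sy
    return pixels

# mark the pixels of every beamlet kept by the detection step (end-point pairs as (x, y))
mask = np.zeros((128, 128), dtype=bool)
for (xa, ya), (xb, yb) in [((10, 20), (90, 40)), ((30, 100), (120, 60))]:
    for x, y in bresenham(xa, ya, xb, yb):
        mask[y, x] = True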


Consider evaluating the performance of the Feature-adapted Beamlet transform 10 compared to the standard Beamlet transform for the detection of multiple line segments in noisy images. The two techniques are tested on images of DNA filaments obtained by fluorescent microscopy. These filaments have a ridge-like profile. For the choice of h, a 2nd order filter is chosen. The same algorithm is used for both transforms with λ=100. The standard Beamlet transform behaves like a low-pass filter and hence is sensitive to the background intensity, as opposed to the Feature-adapted Beamlet transform 10, which can cancel constant or more complex backgrounds, depending on the vanishing moments of h. In the example shown herein, in order to make the two transforms comparable with each other, the background is assumed to be constant and is subtracted from the image before computing the beamlet coefficients. To do so, the background mean intensity is estimated from the median of the image 2. As shown in the top left corner of FIG. 7, the spurious detections are due to the fact that the real background is not constant over the whole image domain. As can be seen in FIG. 8, this is not the case for the exemplary feature adapted Beamlet transform.
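The constant-background correction described above amounts to the following, where the median is used as a robust estimate of the background level in a sparse filament image:

import numpy as np

def subtract_constant_background(image):
    # Most pixels of a sparse filament image are background, so the median intensity
    # is a robust estimate of a constant background level.
    return image - np.median(image)

# With a zero-mean, ridge-like profile h the feature-adapted transform already cancels
# a constant background, so this step mainly helps the standard Beamlet transform.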


Thus, the foregoing discussion discloses and describes merely exemplary embodiments of the present invention. As will be understood by those skilled in the art, the present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. Accordingly, the disclosure of the present invention is intended to be illustrative, but not limiting, of the scope of the invention, as well as of other claims. The disclosure, including any readily discernible variants of the teachings herein, defines, in part, the scope of the foregoing claim terminology.

Claims
  • 1. A method of detecting a curvilinear object of a noisy image, comprising: filtering the noisy image in accordance with a two dimensional line profile, the line profile being selected within a class of steerable filters; and calculating a beamlet coefficient in accordance with the filtering, wherein a coefficient above a predetermined threshold identifies a local feature.
  • 2. The method according to claim 1 further comprising: convolving a noisy image in accordance with a number of basis filters.
  • 3. The method according to claim 2 wherein a filtered image is computed with a beamlet transform application.
  • 4. The method according to claim 3 wherein each image is computed by linear combination.
  • 5. A computer readable storage medium including encoded computer program instructions that cause a computer to detect a curvilinear object of a noisy image, comprising: filtering the noisy image in accordance with a two dimensional line-profile, the line profile being selected within a class of steerable filters; and calculating a beamlet coefficient in accordance with the filtering, wherein a coefficient above a predetermined threshold identifies a local feature.
  • 6. The medium according to claim 5 further comprising: convolving a noisy image in accordance with a number of basis filters.
  • 7. The medium according to claim 5 wherein a filtered image is computed with a beamlet transform application.
  • 8. The medium according to claim 5 wherein each image is computed by linear combination.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims the benefit of the earlier filing date of U.S. Provisional Application No. 60/911,797, filed Apr. 13, 2007, entitled “A Feature Adapted Beamlet Transform Apparatus and Associated Methodology of Detecting Curvilinear Objects of an Image” the entirety of which is incorporated herein by reference.

US Referenced Citations (1)
Number Name Date Kind
5828491 Neuman et al. Oct 1998 A
Related Publications (1)
Number Date Country
20090175541 A1 Jul 2009 US
Provisional Applications (1)
Number Date Country
60911797 Apr 2007 US