Other objects and many of the attendant advantages of the present invention will be readily appreciated and become better understood by reference to the following detailed description when considered in connection with the accompanying drawings. Features that are substantially or functionally equal or similar will be referred to with the same reference signs.
According to a first possibility, the focus of the image 8 can be evaluated by means of image processing. The imaging signals provided by the camera unit are supplied to a processing unit, and a focus evaluation value is derived therefrom.
In
In a first operation mode, the optical measuring device 12 is used for providing measurements of the fiber optic network 16. For this purpose, a fiber 20 of the fiber optic network 16 is coupled to the connection 17, e.g. by means of a fiber connector 21. In a second operation mode, the optical measuring device 12 is used for providing a visual inspection of fibers or components of the fiber optic network 16. In this second operation mode, the imaging unit 14 provides imaging signals from the optical devices to be inspected. For this purpose, the objective 22 of the imaging unit 14 is connected to the fiber connector 21. The optical measuring device 12 can be operated in either one of the two operation modes as well as in a combined first and second operation mode that allows optical measurements and visual inspection to be performed concurrently.
The processing unit 18 comprises suitable software modules for processing measurement signals provided by the measurement unit 15 as well as for processing imaging signals provided by the imaging unit 14.
In order to evaluate the image definition of the acquired image, the processing unit 18 might derive a focus evaluation value from the acquired imaging signals, e.g. by means of image processing. Preferably, only image data within a predefined region of interest (ROI) is used for determining the focus evaluation value.
If an image is well-focussed, each point of an object will correspond to a small image point in the image plane. However, if the image is out of focus, each point of the object will be transformed into a corresponding brightness distribution. With x, y denoting the coordinate values in the image plane, the brightness distribution of a point source's image can be written as:

h(x, y) = (1/(2πσ²)) · exp(−(x² + y²)/(2σ²))
The term x² + y² can be replaced by r² = x² + y². Hence, the brightness distribution can be written as

h(r) = (1/(2πσ²)) · exp(−r²/(2σ²))
In this Gaussian distribution, the optical properties of the objective lens system are described by the parameter σ. For σ=0, the image is in focus. For large values of σ, the image will appear blurred. A large value of σ corresponds to a spread-out spatial distribution of the brightness in the image plane. The function h(r) is generally referred to as the point response of the objective lens system. Once the point response h(r) is known, an image can be calculated by convolving an input signal with the point response.
According to the convolution theorem, the spatial frequency spectrum of the image can be represented as the product of the input signal's Fourier transform with a transfer function H(ρ), which is obtained as the Fourier transform of h(r):

H(ρ) = exp(−σ²ρ²/2)
whereby ρ denotes a spatial frequency. Multiplying the input signal's spatial frequency spectrum with the Gaussian distribution H(ρ) induces a suppression of the spatial frequency spectrum's high-frequency components. The degree of suppression of high frequency components is determined by the parameter σ. If the image is out of focus, the value of σ will be large, and the objective lens system will act as a low pass filter.
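This low-pass behaviour can be sketched in a few lines of Python. This is a minimal illustration, not part of the described device: the image size, the value σ=3, and the frequency cutoff are arbitrary example choices.

```python
import numpy as np

def gaussian_psf(size, sigma):
    """Point response h(r) of the objective lens system (Gaussian model)."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    h = np.exp(-(xx ** 2 + yy ** 2) / (2.0 * sigma ** 2))
    return h / h.sum()  # normalize so the total brightness is preserved

def defocus(image, sigma):
    """Convolve an image with the point response via the FFT."""
    h = gaussian_psf(image.shape[0], sigma)
    H = np.fft.fft2(np.fft.ifftshift(h))  # transfer function H(rho)
    return np.real(np.fft.ifft2(np.fft.fft2(image) * H))

def high_freq_energy(image, cutoff=16):
    """Power of the spatial frequency components above the cutoff radius."""
    spec = np.abs(np.fft.fftshift(np.fft.fft2(image))) ** 2
    ax = np.arange(image.shape[0]) - image.shape[0] // 2
    rr = np.hypot(*np.meshgrid(ax, ax))
    return spec[rr > cutoff].sum()

# A single bright point source: defocusing spreads it out and, acting as a
# low-pass filter, suppresses the high-frequency part of its spectrum.
img = np.zeros((64, 64))
img[32, 32] = 1.0
blurred = defocus(img, sigma=3.0)
assert high_freq_energy(blurred) < high_freq_energy(img)
```

Because the point response is normalized, the total brightness of the image is preserved while its high-frequency content is attenuated.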
In
The spatial frequency spectrum of an image can be used for evaluating the image definition. An image that is in focus will possess a maximum amount of high-frequency components, whereas in an image that is out of focus, the high-frequency components will be suppressed.
In the literature, a number of different functions for evaluating image definition have been described. For example, in the dissertation “Ein Beitrag zur Kamerafokussierung bei verschiedenen Anwendungen der Bildverarbeitung” by Bingzi Liao, Universität der Bundeswehr Hamburg, July 1993, eight different functions for evaluating the definition of an image are described. This dissertation, and in particular the description of the eight different focus evaluation functions, is herewith incorporated by reference into the description of the present application.
In general, when devising a focus evaluation function, the strategy is to determine a measure of the spatial frequency spectrum's high-frequency part.
A first strategy is to determine a two-dimensional discrete Fourier transform G(u, v) of the image g(x, y) and to sum up or integrate the high-frequency part of the corresponding power spectrum |G(u, v)|².
The two-dimensional discrete Fourier transform of an N×N image g(x, y) can be determined as

G(u, v) = Σ_{x=0}^{N−1} Σ_{y=0}^{N−1} g(x, y) exp(−j2π(ux + vy)/N)
However, the calculation of a two-dimensional Fourier transform is computationally expensive. Therefore, it is advantageous to determine G(u, v) as a sequence of two one-dimensional Fourier transforms. For an N×N image, a one-dimensional discrete Fourier transform of the N columns is determined as:

G₁(u, y) = Σ_{x=0}^{N−1} g(x, y) exp(−j2πux/N)
Next, a one-dimensional discrete Fourier transform of the N rows is performed in accordance with:

G(u, v) = Σ_{y=0}^{N−1} G₁(u, y) exp(−j2πvy/N)

whereby G₁(u, y) denotes the column transform.
Preferably, a one-dimensional FFT (Fast Fourier Transform) algorithm is employed for each of the one-dimensional Fourier transforms. It goes without saying that the order of performing the two one-dimensional Fourier transforms related to the image's rows and columns can be interchanged.
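The separable evaluation can be verified numerically. The sketch below uses NumPy's FFT routines (which, like the formulas above, apply no normalization factor); the 8×8 random image is an arbitrary example.

```python
import numpy as np

rng = np.random.default_rng(0)
g = rng.random((8, 8))  # an N x N image with N = 8

# First a 1-D DFT of the columns, then a 1-D DFT of the rows ...
G = np.fft.fft(np.fft.fft(g, axis=0), axis=1)

# ... which equals the full 2-D DFT, and the order can be interchanged.
assert np.allclose(G, np.fft.fft2(g))
assert np.allclose(np.fft.fft(np.fft.fft(g, axis=1), axis=0), G)
```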
Once G(u, v) has been determined, a focus evaluation function LS can be obtained as

LS = Σ_{(u, v) ∈ ψ} |G(u, v)|²
whereby ψ denotes the high-frequency region of the two-dimensional spatial frequency plane.
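Such a focus evaluation function can be sketched as follows. The choice of ψ as everything outside a cutoff radius, and the checkerboard test pattern, are illustration choices and not prescribed by the method itself.

```python
import numpy as np

def focus_LS(image, cutoff):
    """Sum the power spectrum |G(u, v)|^2 over a high-frequency region psi."""
    G = np.fft.fftshift(np.fft.fft2(image))
    n, m = image.shape
    uu, vv = np.meshgrid(np.arange(m) - m // 2, np.arange(n) - n // 2)
    psi = np.hypot(uu, vv) > cutoff  # high-frequency region of the (u, v) plane
    return np.sum(np.abs(G[psi]) ** 2)

# A sharp checkerboard scores high; averaging it over 2x2 blocks removes all
# fine detail, and the focus evaluation value collapses accordingly.
sharp = (np.indices((64, 64)).sum(axis=0) % 2).astype(float)
blurred = 0.25 * (sharp + np.roll(sharp, 1, axis=0) + np.roll(sharp, 1, axis=1)
                  + np.roll(np.roll(sharp, 1, axis=0), 1, axis=1))
assert focus_LS(sharp, cutoff=16) > focus_LS(blurred, cutoff=16)
```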
The right hand side of
In contrast, on the right hand side of
Another possibility is to use the gradient of the image g(x, y) as a starting point for deriving a focus evaluation function. The Fourier transform of a gradient operator can be written as:

F{∂g/∂x} = j2πu · G(u, v),  F{∂g/∂y} = j2πv · G(u, v)
Applying a gradient operator to g(x, y) is equivalent to multiplying the spatial frequency spectrum with the respective spatial frequency u or v. Accordingly, applying a gradient operator to g(x, y) emphasizes the high-frequency part of the spatial frequency spectrum. The summed-up absolute values of the image's gradient can therefore be taken as a measure of the image's high-frequency components. For example, a blurred image does not contain any sharp transitions, and for this reason, the absolute value of the gradient remains relatively small.
For discrete values g(x, y), the gradient can be approximated by the corresponding difference quotients:

∂g/∂x ≈ g(x, y) − g(x−1, y),  ∂g/∂y ≈ g(x, y) − g(x, y−1)
For evaluating the focus of the image, the absolute values of the difference quotients are summed up. Thus, the so-called “Sum Modulus Difference” (SMD) is obtained, which is a measure for the image's absolute gradient. The Sum Modulus Difference (SMD) can be determined in three different ways. For example, for an N×M image, the Sum Modulus Difference SMD1, which is determined as

SMD1 = Σ_{x=1}^{N−1} Σ_{y=0}^{M−1} |g(x, y) − g(x−1, y)|,
extracts the gradient along the x-axis. Correspondingly, the Sum Modulus Difference SMD2

SMD2 = Σ_{x=0}^{N−1} Σ_{y=1}^{M−1} |g(x, y) − g(x, y−1)|
extracts the gradient along the y-axis. For considering the contributions of gradients in both the x- and y-directions, it is advantageous to determine a Sum Modulus Difference SMD3 that is based on the gradient's absolute value:

SMD3 = Σ_{x=1}^{N−1} Σ_{y=1}^{M−1} √((g(x, y) − g(x−1, y))² + (g(x, y) − g(x, y−1))²)
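The three SMD variants can be sketched as follows. This is a minimal illustration: the mapping of x and y onto array axes, the square-root form of SMD3, and the test pattern are example choices.

```python
import numpy as np

def smd1(g):
    """Sum Modulus Difference of the gradient along the x-axis."""
    return np.abs(np.diff(g, axis=1)).sum()

def smd2(g):
    """Sum Modulus Difference of the gradient along the y-axis."""
    return np.abs(np.diff(g, axis=0)).sum()

def smd3(g):
    """SMD based on the absolute value of the full gradient vector."""
    gx = np.diff(g, axis=1)[:-1, :]
    gy = np.diff(g, axis=0)[:, :-1]
    return np.hypot(gx, gy).sum()

# A sharp checkerboard has many steep transitions; a completely defocused
# image approaches uniform brightness, and all three measures drop to zero.
sharp = (np.indices((32, 32)).sum(axis=0) % 2).astype(float)
defocused = np.full_like(sharp, sharp.mean())
assert smd1(sharp) > smd1(defocused) == 0.0
assert smd2(sharp) > smd2(defocused) == 0.0
assert smd3(sharp) > smd3(defocused) == 0.0
```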
In the solutions that have been described so far, the focus evaluation value has been derived by processing the acquired image data. Another possibility is to add additional hardware to the optical imaging unit, with said hardware being adapted for measuring a distance between the optical imaging unit and an optical element's surface, in order to generate a focus evaluation signal. Preferably, the optical imaging unit is equipped with a triangulation unit.
A solution of this kind is shown in
Another alternative solution for determining the distance between the optical imaging unit 25 and the optical element 26 is to analyze, by means of image processing, the position of a light spot of the light beam 32 on the optical element's surface. The light beam 32 is directed towards the fiber's surface at a predefined angle of incidence. The position of the light spot on the fiber's surface can be used for deriving the relative distance between the optical imaging unit 28 and the fiber's surface.
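The geometry behind this spot-position method can be sketched as follows. The 45-degree angle and the millimetre values are arbitrary illustration choices, and the angle of incidence is taken here as measured from the surface normal.

```python
import math

def axial_change(spot_shift, angle_deg):
    """Axial distance change inferred from the lateral shift of the light
    spot, for a beam hitting the surface at angle_deg from the normal."""
    return spot_shift / math.tan(math.radians(angle_deg))

# At 45 degrees, a 0.1 mm lateral spot shift corresponds to a 0.1 mm change
# in the distance between the imaging unit and the fiber's surface.
assert abs(axial_change(0.1, 45.0) - 0.1) < 1e-9
```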
The focus evaluation value that has been determined by one of the above described techniques can be indicated to the user as a focussing aid. In this semi-automatic approach, the user is responsible for manually adjusting the focus of the optical imaging unit. For example, the focus evaluation value can be converted into corresponding figures or symbols that are displayed to the user. Alternatively, the focus evaluation value can be converted into an acoustic signal, or into a tactile feedback signal. For example, the frequency of a focus evaluation tone can be varied in accordance with the instantaneous image definition. While listening to the tone, the user can adjust the focus until the highest possible (or lowest possible) frequency is reached.
Alternatively, the optical imaging unit can be provided with an autofocus unit adapted for automatically adjusting the focus. For this purpose, the optical imaging unit might be equipped with an actuator for electromechanically varying the distance between the optical imaging unit and the optical element, or for adjusting the focus of the objective lens system.
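A hill-climbing autofocus loop of this kind might be sketched as follows. The capture, actuator, and metric callbacks as well as all step sizes are hypothetical; any of the focus evaluation functions described above (e.g. an SMD) could supply the metric.

```python
def autofocus(capture_image, move_actuator, focus_value, steps=50, step=1.0):
    """Move the actuator until the focus evaluation value stops improving.

    capture_image, move_actuator and focus_value are hypothetical callbacks
    standing in for the camera unit, the electromechanical actuator and a
    focus evaluation function.
    """
    best = focus_value(capture_image())
    direction = +1
    for _ in range(steps):
        move_actuator(direction * step)
        value = focus_value(capture_image())
        if value < best:
            move_actuator(-direction * step)  # undo the worsening move
            direction = -direction            # try the other direction
            step *= 0.5                       # and refine the step size
        else:
            best = value
        if step < 1e-3:
            break
    return best

# Simulated example: the "image" is just the lens position z, and the focus
# evaluation value peaks at the (hypothetical) best-focus position z = 0.
pos = {"z": 3.0}
move = lambda dz: pos.__setitem__("z", pos["z"] + dz)
capture = lambda: pos["z"]
metric = lambda z: -(z ** 2)
autofocus(capture, move, metric)
assert abs(pos["z"]) < 0.5  # the loop has driven the lens close to best focus
```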
For detecting faults of an optical fiber connection, it is advantageous to use a visual inspection tool like the one shown in
A visual inspection tool like the one shown in
Furthermore, the optical imaging unit 40 can be equipped with a signal light detection unit. In case a signal light component is received via the fiber 41, the presence of this signal light component can be detected and indicated to the user. In particular, the user can be informed about the presence of non-visible light components, e.g. of signal light components in the infrared.
Furthermore, the optical imaging device might comprise a cleaning facility that allows dirt or fluid films (such as oil films) contaminating the optical element's surface to be removed. For example, the fiber's surface can be cleaned by means of an air jet directed at the fiber's surface. In this embodiment, it has to be ensured that the air jet is oil-free. It might therefore be advantageous to utilize a compressor unit comprising an oil interceptor. Another possibility is to provide means for immersing the fiber's surface in an ultrasonic cleaning facility. Alternatively or additionally, the optical imaging unit might be equipped with one or more brushes adapted for mechanically cleaning the optical element's surface.