The present invention relates generally to the domain of imaging systems and methods.
The following description relates to the technical background of the present invention:
Endoscopes are common medical instruments used to perform inspection of internal organs. There are two main types of endoscopes: flexible and rigid. Flexible endoscopes are constructed from a bundle of single mode fibers, where each fiber in the bundle transmits backwards spatial information corresponding to a single spatial point, i.e. a single pixel. The fiber bundle may go into the body while the imaging camera remains outside; interface optics adapt the photonic information coming out of the bundle to the detection camera. The reason for using a single mode fiber for each fiber in the bundle, rather than multimode fibers (capable of transmitting spatial information corresponding to a plurality of pixels), is that the endoscope may be bent when inserted and navigated inside the body. When multimode fibers are bent, the spatial modes couple to each other and the image is strongly distorted. The typical diameter of a single mode fiber in the bundle is about 30 μm (this is the diameter of its cladding; the core has a diameter of about 8-9 μm). The typical number of fibers in the bundle is about 10,000-30,000, and the typical overall diameter of the entire bundle is about 3 mm-5 mm.
The other type of endoscope is the rigid endoscope. In this case the camera goes inside the body of the patient, rather than staying outside, and is located at the tip of a rigid rod. Although the image quality of rigid endoscopes is usually better, and they allow not only backwards transmission of images but also other medical treatment procedures, their main disadvantage is that they are indeed rigid and thus less flexible and less suited for in-body navigation procedures.
There are alternative solutions to endoscopy, for instance pills swallowed by the patient that are capable of capturing images of internal organs while propagating through the stomach and the intestine.
Multicore fibers have also been proven suitable for high quality imaging tasks. Refs. [2-4] give an overview of the state of the art of in vivo fluorescence imaging with high resolution micro lenses. Refs. [5-7] demonstrate such a micro endoscope for in vivo brain fluorescence imaging applications. The use of multicore fibers may be preferred in invasive applications, as the small diameter of such an instrument minimizes damage.
For example, an endoscope utilizing a multicore fiber is described in US Patent Application US-A1-2010/0046897 which discloses an endoscope system including an image fiber with an image fiber main body made of a plurality of cores for forming pixels and a cladding common thereto; and an optical system connected to an eyepiece side of the image fiber for causing laser light to enter the image fiber and for taking in an image from the image fiber, in which the image fiber has the cores arranged substantially uniformly over a cross-section of the image fiber main body, the cross-section being perpendicular to a longitudinal direction of the image fiber main body.
The use of multicore fibers is advantageous in various applications, including medical applications, because of their small size and the possibility of making the instrument desirably flexible, if needed. However, multicore fiber based imaging faces issues related to limited resolution and/or field of view when used for imaging in near field or in far field conditions.
The present invention provides a novel imaging system and method enabling these limitations to be overcome. The imaging system of the invention includes a multicore fiber forming, or being part of, an optical imaging unit for imaging an object on a detection array. The multicore fiber faces an object plane by its input edge and the detector plane by its output edge. The multicore fiber thus collects light from the object at the input edge and transfers the collected light to the output edge. Further provided in the imaging system is a displacing unit which operates to provide at least a lateral shift of the input edge of the multicore fiber relative to the object, i.e. in a plane substantially perpendicular to the optical axis of the optical imaging unit. By this, a set of shifted images of the object can be sequentially obtained at the detector array. The displacing unit is controllably operated by an operating unit, which sets a shifting amplitude to be either a first amplitude smaller than or equal to the diameter of a core in the multicore fiber, or a second amplitude larger than or equal to the diameter of the multicore fiber.
As will be described further below, when imaging in a near field mode, the spatial resolution may be sufficiently high but the field of view is generally limited. By laterally shifting the multicore fiber with the second shifting amplitude, the field of view of the combined image, formed from multiple successively acquired images, can be improved. On the other hand, when imaging an object in a far field mode, the field of view may be sufficient but the spatial resolution is typically limited. This can be solved, or at least partially solved, by laterally shifting the multicore fiber with the first shifting amplitude.
The invention provides for utilizing both the lateral shift of the multicore fiber, as described above, and a longitudinal shift switching the system operation between the far and near field modes. The latter can be implemented either by shifting the input edge of the multicore fiber itself along the optical axis, or alternatively (or additionally) by using imaging optics movable along the optical axis of the optical imaging unit. Practically, it is easier to locate such a lens between the multicore fiber and the detector array, but generally a movable lens may be placed between the object and the multicore fiber.
Thus, according to one broad aspect of the present invention, there is provided a system for imaging an object comprising an optical imaging unit defining an optical axis and configured for imaging an object on a detection array, the optical imaging unit comprising a multicore fiber configured to collect light from the object at an input edge of the multicore fiber and transfer the collected light to an output edge of the multicore fiber; a displacing unit configured to shift the input edge of the multicore fiber relative to the object in a plane substantially perpendicular to the optical axis to obtain a set of shifted images of the object; and an operating unit configured to operate the displacing unit by setting a shifting amplitude to either a first amplitude smaller than or equal to the diameter of a core of the multicore fiber or a second amplitude larger than or equal to the diameter of the multicore fiber.
In some embodiments, the optical imaging unit comprises an optical assembly configured to collect light from the output edge of the multicore fiber and form an image of the object on the detection array.
In some embodiments, the optical imaging unit is configured for selectively operating in either one of near field and far field imaging modes.
In some embodiments, the optical imaging unit comprises a lens unit arranged upstream of the input edge of the multicore fiber with respect to a direction of light propagation through the system, said lens unit being displaceable along the optical axis with respect to the object.
In some embodiments, the system further comprises a processing unit connectable to the detection array and configured to receive and process data indicative of the set of shifted images for obtaining a combined image of the object by interlacing one or more of said shifted images, said combined image having an improved resolution and/or field of view.
In some embodiments, the system further comprises a detection unit configured to monitor a distance between the object and the input edge of the multicore fiber to thereby enable the operation of the optical imaging unit in either one of the near field and far field imaging modes.
In some embodiments, the system further comprises a display unit for displaying at least one of the images.
In some embodiments, the operating unit comprises an input utility configured to receive input from a user defining whether the field of view or the resolution of the imaging is to be improved.
In some embodiments, the operating unit further comprises a communication utility to communicate with the detection unit, and a shift controller for setting the shifting amplitude based on a distance between the object and the input edge of the multicore fiber and on whether a field of view or a resolution of the original image is to be improved.
In some embodiments, the operating unit comprises a communication utility to communicate with the input utility.
In some embodiments, the multicore fiber is either a fiber bundle or a photonic crystal.
In some embodiments, the multicore fiber has a polygonal cross section defining two opposite substantially parallel facets.
In some embodiments, the cross section of the multicore fiber is rectangular.
In some embodiments, the system further comprises electrodes located on said opposite facets of the multicore fiber to carry out at least one of electrical stimulation and temperature sensing using the Peltier effect.
In some embodiments, the system further comprises a coherent light source configured to illuminate the object and provide a reference wave front; and a holographic or interferometric setup configured to provide interference between the reference wave front and a wave front reflected by the object and transferred by the multicore fiber, thereby providing phase information on the light reflected by the object.
According to another broad aspect of the invention, there is provided a method for imaging an object comprising: transferring light coming from the object through a multicore fiber having an input edge and an output edge; imaging the object on a detection array by collecting light from the output edge of the multicore fiber; setting a shifting amplitude for the multicore fiber, such that the shifting amplitude is either a first amplitude smaller than or equal to the diameter of a core of the multicore fiber or a second amplitude larger than or equal to the diameter of the multicore fiber, to enable improvement of either the resolution or the field of view of the imaging; shifting the input edge of the multicore fiber using said shifting amplitude, thereby obtaining a set of shifted images of the object; and processing said set of shifted images in order to obtain a combined image of the object by interlacing said shifted images for improving the resolution or the field of view.
In some embodiments, the method further comprises selectively imaging the object in either one of near field and far field imaging modes.
In some embodiments, the method comprises moving a lens unit, positioned in front of the input edge of the multicore fiber, along the axis of light propagation, with respect to the object.
In order to understand the invention and to see how it may be carried out in practice, embodiments will now be described, by way of non-limiting example only, with reference to the accompanying drawings.
The present invention proposes an imaging system that includes a multicore fiber (also referred to as a probe) containing tens of thousands of cores properly separated to avoid optical leakage between them even when bent (e.g. when navigated through a body). The structure of the probe allows performing resolution enhancement, i.e. super resolution, based on shifts of an input tip of the probe. The structure of the probe also enables performing field of view enhancement based on shifts of the input tip. Further, the optical cores of the multicore fiber act to transmit a wave front backwards and to generate an image; however, one or more of the cores may also be used to illuminate the object itself, or even to heat the object if illuminated with high photonic power density. Furthermore, illuminating the object with coherent light, such as laser light, may allow extraction of 3D information by an interference configuration near the detection plane, in which not only the amplitude but also the phase of the reflected wave front can be estimated. For example, an active reference beam at the detector array plane may be interfered with a wave front reflected by the object and transferred through the multicore fiber, thereby enabling phase information on the wave front reflected by the object to be obtained. The phase information may enable obtaining 3D information on the object, i.e. building a profile of the object. In another embodiment, the interference configuration may be replaced by a holographic setup.
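To illustrate how interference with a reference beam can yield the phase of the reflected wave front, the following is a minimal sketch using four-step phase-shifting interferometry; the source does not specify this particular scheme, and all names and values are illustrative:

```python
import numpy as np

# Four-step phase-shifting interferometry (an assumed scheme; the source only
# states that interference with a reference beam yields phase information).
rng = np.random.default_rng(0)
phi = rng.uniform(0.0, 2.0 * np.pi, size=(64, 64))  # unknown object phase map
R, O = 1.0, 0.8                                     # reference/object amplitudes

# Interferograms recorded with reference phase steps of 0, pi/2, pi, 3pi/2.
I = [R**2 + O**2 + 2.0 * R * O * np.cos(phi - k * np.pi / 2.0) for k in range(4)]

# Recover the phase: I1 - I3 = 4*R*O*sin(phi), I0 - I2 = 4*R*O*cos(phi).
phi_est = np.arctan2(I[1] - I[3], I[0] - I[2]) % (2.0 * np.pi)
print(np.allclose(phi_est, phi))  # True: the phase (hence a 3D profile) is recovered
```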
The probe allows realization of an optical operation equivalent to optical zooming, i.e. reducing the field of view and increasing the sampling resolution. This operation may be obtained by axially shifting an optical assembly (i.e. moving from the far field regime, which provides a large field of view and lower resolution, into the near field approximation, which provides good resolution and a small field of view).
The cross section of the probe may be rectangular, thereby allowing two of its opposite faces to be coated with metals to realize an electrical stimulation capability at its edge, including heating/cooling or thermal sensing based upon the Peltier effect when two different types of metals are used for the coating.
The proposed probe can be used for a large variety of new biomedical applications in which its thin diameter allows noninvasive medical operability. The applications may involve navigation through blood arteries, passing through the tear ducts into internal chambers of the nose and the head, navigation through the lungs, especially those of small children (having smaller channels), and performing prostate as well as womb related medical treatments.
The optical imaging unit 2 defines an optical axis X and comprises a multicore fiber 20 (also referred to as a probe) and an optical assembly 30. The multicore fiber 20 may transfer light arriving from the object 3 at an input edge 21 of the multicore fiber 20 toward an output edge 22 of the multicore fiber 20. The optical assembly 30 may be configured to collect light at the output edge 22 of the multicore fiber 20 and to focus the collected light on the detector array 4. The optical assembly 30 may be configured to form images of the object 3 positioned either in the far field or in the near field of the imaging unit 2, i.e. relatively far from or relatively close to the input edge 21. In an embodiment, the optical assembly 30 may be adaptable with regard to the position of the object 3 to be imaged on the detector array 4. In an embodiment, the optical assembly 30 may be an imaging lens positioned between the output edge 22 and the detector array 4. In said embodiment, the imaging lens may be longitudinally displaceable along the optical axis X so as to accommodate light from an object 3 positioned either in the far field or in the near field. In another embodiment, the optical assembly 30 may comprise an input lens positioned upstream of the input edge 21. The diameter of a core and the diameter of the multicore fiber 20 may be respectively referred to as d and D. The values of d and D are defined by fabrication and application related limitations. For example, D may be smaller than 300 μm in order to remain noninvasive in certain medical applications. The value of d may be determined according to a desired spatial resolution. If D is equal to 300 μm and one wishes to have a resolution of 100×100 pixels, d may be about 3 μm. Generally, d may be larger than the optical wavelength of the collected light in order to allow coupling of light into the fiber with sufficient energetic efficiency.
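As a brief numerical sketch of the sizing considerations above (plain Python; the variable names are illustrative):

```python
# Sizing check for the multicore probe, using the example values in the text.
D_um = 300.0                 # overall probe diameter (non-invasiveness limit)
pixels_across = 100          # desired 100 x 100 pixel resolution
d_um = D_um / pixels_across  # core diameter needed to fit one core per pixel
print(d_um)                  # 3.0 um, matching the ~3 um figure above
# d should also exceed the optical wavelength (~0.4-1 um for visible light)
# so that light couples into each core with sufficient energetic efficiency.
```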
The imaging system 1 may further comprise a detection unit 40, an optical assembly controller 50, a displacing unit 60, an operating unit 70 and a processing unit 80. The detection unit 40 may be configured to monitor the position of the object 3 with regard to the input edge 21. For example, the detection unit 40 may determine a longitudinal distance between the input edge 21 of the multicore fiber 20 and the object 3 to determine whether the object 3 is in the near field or in the far field. The detection unit 40 may comprise a detection communication utility to transmit data indicative of the longitudinal distance between the input edge 21 of the multicore fiber 20 and the object 3 to the optical assembly controller 50 and/or to the operating unit 70. The optical assembly controller 50 may comprise a controller communication utility to communicate with the detection unit 40 so as to receive data from the detection unit 40 indicative of the longitudinal distance between the input edge 21 of the multicore fiber 20 and the object 3. The optical assembly controller 50 may be configured to adapt the optical assembly 30 based on said data so as to focus light emitted by the object 3 on the detector array 4. In other words, the optical assembly controller 50 may be configured to operate the optical assembly 30 so that the light output from the multicore fiber 20 forms an image on the detector array 4 according to a near field or far field configuration of the object 3. In the embodiment previously mentioned in which the optical assembly 30 is the imaging lens, the optical assembly controller 50 may be configured to displace the imaging lens longitudinally along the optical axis X to focus light transferred by the multicore fiber 20 on the detector array 4. The optical assembly controller 50 may comprise a processor configured to process said data indicative of the longitudinal distance between the input edge 21 of the multicore fiber 20 and the object 3 so as to determine a configuration of the optical assembly 30 for imaging the object 3 according to conjugation relations. In the embodiment in which the optical assembly 30 is the imaging lens, since with respect to its imaging related properties the multicore fiber 20 may actually be regarded as if the input and output edges 21, 22 of the multicore fiber 20 act similarly to principal planes of a lens, the position of the optical assembly 30 may be determined according to the following relation:
1/(U1+U2)+1/V=1/F
wherein U1 is the distance between the object 3 and the input edge 21 of the multicore fiber 20, U2 is the distance between the output edge 22 of the multicore fiber 20 and the optical center of the optical assembly 30, V is the distance between the optical center of the optical assembly 30 and the detection array 4, and F is the focal length of the optical assembly 30.
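A minimal computational sketch of this relation, assuming it is the thin-lens conjugation 1/(U1+U2)+1/V=1/F with the fiber edges acting as principal planes (consistent with the demagnification M=V/(U1+U2) given further below); the function name and example values are illustrative:

```python
def lens_to_detector_distance(u1_mm: float, u2_mm: float, f_mm: float) -> float:
    """Distance V between the optical assembly and the detector array,
    from the assumed conjugation relation 1/(U1 + U2) + 1/V = 1/F."""
    u = u1_mm + u2_mm  # effective object distance seen by the imaging lens
    if u <= f_mm:
        raise ValueError("object inside the focal length: no real image")
    return 1.0 / (1.0 / f_mm - 1.0 / u)

# Example: U1 = 5 mm, a 1 m long fiber path (U2 ~ 1000 mm) and F = 20 mm.
print(lens_to_detector_distance(5.0, 1000.0, 20.0))  # ~20.4 mm
```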
The displacing unit 60 (or probe displacing unit) may be configured to shift the input edge 21 of the multicore fiber 20 relative to the object 3. In an embodiment, the shift may be performed in a plane substantially perpendicular to the optical axis X (a so-called "lateral shift") in order to form a set of shifted images of the object 3 on the detector array 4. Alternatively, or preferably additionally, the shift may be performed along the optical axis (a so-called "longitudinal shift"). This may be implemented by the same displacing unit moving the input edge 21 of the multicore fiber 20 along both axes, or by an additional displacing unit associated with a lens upstream or downstream of the probe.
The probe displacing unit 60 may comprise a displacing communication utility configured to receive shifting amplitude instructions and/or shifting direction instructions from the operating unit 70, thereby enabling the operating unit 70 to operate the displacing unit 60. The operating unit 70 may further comprise an operating communication utility to communicate with the detection unit 40 so as to receive data from the detection unit 40 indicative of the longitudinal distance between the input edge 21 of the multicore fiber and the object 3. The operating communication utility may further be configured to receive indications from an input utility (not shown) on whether the resolution or the field of view of the imaging is to be improved. The operating unit 70 may further comprise a shift controller configured to set a shifting amplitude to either a first amplitude smaller than or equal to the diameter of a core of the multicore fiber 20 or a second amplitude larger than or equal to the diameter of the multicore fiber 20. The setting of the amplitude may for example be based on the distance between the input edge 21 and the object 3 (i.e. a near field mode or a far field mode) and on the input from a user received through said input utility.
The processing unit 80 may be connectable to the detector array 4 and configured to acquire and process the set of shifted images formed on the detector array 4 by interlacing said set of shifted images, thereby obtaining a combined image of a better resolution or field of view. The operating unit 70 may provide the shifting amplitude set to the processing unit 80. For the near field super resolving used in the present invention, the image processing of the interlaced set of shifted images may be performed according to the spatial masking technique disclosed in references [9, 10]. For far field super resolving, an earlier image processing technique developed by the inventor may be used.
Therefore, the present system provides a compact and ergonomic solution that is simple to manufacture and able to provide images with a high resolution. Further, a user may rely on the operating unit to perform the shifting with a limited amount of manual operations.
In the near field mode, to improve resolution, given a predetermined super resolving factor K (K being an integer) to be achieved, the shifting amplitude may be set to the first amplitude. The first amplitude may be determined by the following relation:
A1=d/K,
wherein d is the diameter of the core of the fiber. Further, the number of shifts performed may be equal to the super resolving factor.
In the far field mode, to improve resolution, given the predetermined super resolving factor, the shifting amplitude may be set to the second amplitude. The second amplitude may be determined by the following relation:
A2=D,
wherein D is the diameter of the multicore fiber. Further, the number of shifts may be equal to the super resolving factor.
In order to increase the field of view (and not the resolution) in near field, the shifting may be performed with the second shifting amplitude A2. Particularly, in order to increase the field of view by a factor of K one may perform K shifts with the second shifting amplitude A2. In the far field, in order to increase the field of view by a factor of K one may perform K shifts with the first shifting amplitude A1.
The position of the object may be detected in order to determine if the object is positioned in the near field or in the far field of the imaging unit. For example, the longitudinal distance between an input edge of the multicore fiber and the object may be detected by a sensor. In an embodiment, the shifting amplitude is set to the second amplitude when (a) the object is in the far field and the resolution is to be improved, and/or (b) the object is in the near field and the field of view is to be improved. In an embodiment, the shifting amplitude is set to the first amplitude when (c) the object is in the near field and the resolution is to be improved, and/or (d) the object is in the far field and the field of view is to be improved. Therefore, the same shifting amplitude may be used for different purposes, i.e. enhancing resolution or field of view, when imaging an object in different modes, i.e. in far field and near field modes. In a fourth step S104, the multicore fiber input edge may be shifted by said shifting amplitude in order to obtain a set of shifted images. In a fifth step S105, the set of shifted images may be processed to obtain a combined image which attains the improvement defined, i.e. a better resolution or field of view than the image obtained in step S101 (also referred to as the original image).
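The amplitude selection of cases (a)-(d), together with the relations A1=d/K and A2=D above, may be summarized as in the following sketch (illustrative Python; the function and parameter names are assumptions):

```python
def shifting_amplitude_um(mode: str, goal: str, d_um: float, D_um: float, K: int) -> float:
    """Shift amplitude in micrometers; K shifts are then performed.
    mode: 'near' or 'far' (field); goal: 'resolution' or 'fov'.
    Cases (a)/(b) use the second amplitude A2 = D;
    cases (c)/(d) use the first amplitude A1 = d / K."""
    A1 = d_um / K  # first amplitude: at most one core diameter
    A2 = D_um      # second amplitude: at least the probe diameter
    if (mode, goal) in [('far', 'resolution'), ('near', 'fov')]:
        return A2
    if (mode, goal) in [('near', 'resolution'), ('far', 'fov')]:
        return A1
    raise ValueError("mode must be 'near'/'far' and goal 'resolution'/'fov'")

# Example: near field, resolution improvement by K = 4 with 3 um cores.
print(shifting_amplitude_um('near', 'resolution', d_um=3.0, D_um=300.0, K=4))  # 0.75
```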
The implementation of the processing of the set of shifted images, and of the setting of the shifting amplitude based on the relative position of the object with regard to the multicore fiber and on the imaging improvement to be attained in the original image, may be better understood considering the following:
Any imaging system has a limited capability to discriminate between two spatially adjacent features. The physical factors that limit this capability can be divided into two types. The first type is related to the effect of diffraction of light propagating from the object towards the imaging sensor [8]. The resolution limit due to diffraction, as obtained in the image plane, equals:
δx=1.22λF#
where λ is the optical wavelength and F# denotes the F-number, which is the ratio between the focal length and the diameter of the imaging lens.
The second type is related to the geometry of the detection array [8, 9]. The geometrical limitation can be divided into two kinds of limitations. The first is related to the pitch of the sampling pixels, i.e. the distance between two adjacent pixels. This distance determines, according to the Nyquist sampling theorem, the maximal spatial frequency that can be recovered without spectral aliasing (generated when signals are under-sampled in the space domain):
δpitch=1/(2νmax)=1/BW
where δpitch is the pitch between adjacent pixels, νmax is the maximal spatial frequency that may be recovered and BW is the bandwidth of the spectrum of the sampled image. The second kind is related to the shape of each pixel and to the fact that each pixel is not a delta function, and thus it realizes a non-ideal spatial sampling.
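Both limits are straightforward to evaluate numerically; the following is a short sketch with assumed example values (not taken from the source):

```python
# Diffraction limit at the image plane: delta_x = 1.22 * lambda * F#.
wavelength_um = 0.6   # assumed illumination wavelength
f_number = 4.0        # assumed F# = focal length / lens diameter
print(1.22 * wavelength_um * f_number)  # ~2.93 um

# Geometrical (Nyquist) limit: delta_pitch = 1/(2*nu_max) = 1/BW.
pitch_um = 3.0                       # assumed pixel pitch of the detector array
nu_max = 1.0 / (2.0 * pitch_um)      # maximal recoverable spatial frequency
print(nu_max)                        # ~0.167 cycles/um (BW = 2 * nu_max)
```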
In fact, the type of resolution reduction imposed by the multicore probe depends on the distance between the edge of the probe and the object (previously denoted U1).
Diffraction resolution reduction is obtained when the input plane of the probe is relatively far from the object (far field approximation), and then the light distribution on this plane resembles the light distribution over the imaging lens aperture. In that case the diameter of the fiber D sets the maximal spatial frequency transmitted by the fiber, and therefore also the spatial resolution obtainable in the image plane:
δx=λV/D
and the fact that there are multiple cores is equivalent to sampling in the Fourier plane, which means replication in the image plane, yielding a restriction on the obtainable field of view:
Δx=λV/d
where Δx is the obtainable field of view in the image plane and d is the pitch between two adjacent cores in the multicore probe.
The geometrical limitation is obtained when the distance between the fiber and the object (U1) is relatively small (near field approximation), and then the field of view is limited by the diameter of the fiber D while the pitch between two cores d determines the spatial sampling resolution:
Δx=MD and δx=Md
where M is the demagnification factor of the proposed imaging system, which equals:
M=V/(U1+U2)
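The two regimes can be compared with a short sketch implementing the four relations above (illustrative Python; the example values are assumptions):

```python
def far_field_limits_um(lam_um: float, V_um: float, D_um: float, d_um: float):
    """Far field: resolution dx = lam*V/D, field of view Dx = lam*V/d."""
    return lam_um * V_um / D_um, lam_um * V_um / d_um

def near_field_limits_um(V_um: float, u1_um: float, u2_um: float,
                         D_um: float, d_um: float):
    """Near field: M = V/(U1+U2); field of view Dx = M*D, resolution dx = M*d."""
    M = V_um / (u1_um + u2_um)
    return M * D_um, M * d_um

# Assumed values: lambda = 0.6 um, V = 20 mm, D = 300 um, d (core pitch) = 3 um.
print(far_field_limits_um(0.6, 20_000.0, 300.0, 3.0))  # dx = 40 um, Dx = 4000 um
print(near_field_limits_um(20_000.0, 100.0, 1_000_000.0, 300.0, 3.0))  # ~6 um, ~0.06 um
```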
The imaging method of the present invention selectively overcomes the geometrical limitation or the diffraction limitation, based on detecting whether the object is in the near field or in the far field and on accordingly setting an appropriate shifting amplitude to obtain a set of shifted images, thereby enabling super resolution processing to be conducted. In super resolution the idea is to encode the spatial information that could not be imaged with the optical system into some other domain, to transmit it through the system, and then to decode it [8]. The most common domain for doing so is the time domain.
Therefore, a way of obtaining resolution improvement in the proposed configuration can be as follows: in the case of a far field arrangement, when the limiting factor is related to diffraction, the fiber itself can be shifted over time. This time scanning operation is equivalent to the generation of a synthetically increased aperture, similar to what happens in synthetic aperture radar (SAR). In this scanning operation the resolution improvement factor is proportional to the ratio between the scanned region and the diameter of the probe D. If instead of super resolution one wishes to increase the imaging field of view, the probe needs to be shifted at an amplitude of less than d in order to generate over-sampling of the spectrum domain by its multiple cores. In this case a set of images is captured, each obtained after performing a shift of sub-core distance. Then all the images are interlaced together accordingly to generate effective sub-core sampling. In the case of the near field approximation, temporal scanning can once again improve the resolving capability, as described in Refs. [9, 10]. In this case the shift is limited by the size of d. Once again a set of images is captured, each obtained after performing a shift of sub-core distance, and all the images are then interlaced together accordingly to generate effective sub-core sampling. In case that instead of resolution improvement one wishes to obtain an increase in the imaging field of view, the probe can again perform scanning, but this time at a larger amplitude. The field of view enlargement is proportional to the ratio between the shift amplitude and the diameter of the probe D.
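As an illustration of the interlacing of sub-core-shifted images into an effectively denser sampling grid, the following is a minimal one-dimensional sketch (assuming K shifts of d/K along a single axis; names are illustrative):

```python
import numpy as np

def interlace_shifted_images(images):
    """Interlace K frames, each acquired after an additional shift of d/K,
    into one frame sampled K times more densely along the shift axis."""
    K = len(images)
    h, w = images[0].shape
    combined = np.empty((h, w * K), dtype=images[0].dtype)
    for k, frame in enumerate(images):
        combined[:, k::K] = frame  # frame k supplies every K-th column
    return combined

# Usage: four 100x100 frames taken with shifts of d/4 -> one 100x400 frame
# whose effective sampling pitch along the shift axis is d/4.
frames = [np.random.rand(100, 100) for _ in range(4)]
print(interlace_shifted_images(frames).shape)  # (100, 400)
```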
The above examples and description have of course been provided only for the purpose of illustration, and are not intended to limit the invention in any way. As will be appreciated by the skilled person, the invention can be carried out in a great variety of ways, employing more than one technique from those described above, all without exceeding the scope of the invention.
This application is Continuation of U.S. application Ser. No. 17/569,555 filed on Jan. 6, 2022, which is a Continuation of U.S. application Ser. No. 16/992,891 filed on Aug. 13, 2020, which is a Continuation of U.S. application Ser. No. 13/978,423 filed on Jul. 5, 2013 which is a National Phase application of PCT International Application No. PCT/IL2012/050004, International Filing Date Jan. 5, 2012, claiming priority of U.S. Provisional Application No. 61/457,116, filed Jan. 5, 2011, all of which are hereby incorporated by reference in their entirety.
Number | Date | Country
--- | --- | ---
61457116 | Jan 2011 | US
Relation | Number | Date | Country
--- | --- | --- | ---
Parent | 17569555 | Jan 2022 | US
Child | 18444957 | | US
Parent | 16992891 | Aug 2020 | US
Child | 17569555 | | US
Parent | 13978423 | Jul 2013 | US
Child | 16992891 | | US