This application claims priority to German Patent Application No. 102023130062.8 filed Oct. 31, 2023, and entitled “Imaging Device for a Distal End Section of an Endoscope, Objective System, and Endoscope,” which is incorporated herein by reference.
The present invention relates to an imaging device for a distal end section of an endoscope suitable for white light and fluorescence emission imaging.
Endoscopes for use in medical or non-medical applications may utilize both white light imaging and fluorescence imaging. Endoscopic instruments intended for industrial use, rather than medical use, are often referred to as borescopes. As this invention relates to both medical endoscopes and borescopes, the term “endoscope” is used to generally include both instruments. Conventional endoscopes able to capture both white light and fluorescence images with a single optical path and only a single image sensor do so by collecting alternating frames with staggered white light and excitation illumination. Such an arrangement results in a frame rate that is significantly lower than that of a system using only visible white light, typically reaching, at most, half of the frame rate of a white-light-only system. The low frame rate may lead to unwanted motion blurring. In addition to the necessary shuttering between white light and fluorescence frames, the sensitivity of the fluorescence frames is limited by the shared optical path, which typically makes it impossible to attenuate one signal relative to the other. Image processing subsequently analyzes and combines the images from alternating frames into a fluorescence/white light overlay with a low overall frame rate. This technique results in a lower sensitivity in the fluorescence range as well as a reduced brightness and resolution of the overlaid image compared to a white-light-only image.
An alternative to using only one image sensor, shuttering the frames, and overlaying the white light and fluorescence images, would be to place two complete objective systems in an endoscope for capturing white light and fluorescence images separately. However, due to the space requirements of two objective systems in parallel, not to mention the added complexity of orientation, increased expense, and loss of image brightness, this is rarely a feasible solution for chip-on-the-tip (COTT) endoscopes, where the respective electronic image sensor or sensors is/are arranged in the distal end section, e.g., the tip of the narrow shaft of the endoscope.
Instead of using two complete objective systems, it is also possible to use just one objective system and split the beams afterwards by means of a beam splitter directing the split beams onto two different image sensors. However, it is not feasible to miniaturize the conventional beam splitter and the two image sensors into the small space provided by COTT endoscopes. For example, splitting an afocal beam path including both white and fluorescence light by means of a 45° beam splitter has the disadvantage, besides requiring the afocal beam path itself, that a subsequent focusing is required. Alternatively, if a conventional beam splitter is used in the converging beam path in front of the image sensors, aberrations occur on the sensor that reduce both the transmission and the image quality. Overall, this results in large imaging devices due to the necessary refocusing.
US 2020/0088579 A1 discloses a hybrid spectral imager comprising multiband filtering optics with a beam divider means for generating at least two replica images of a target image and a tunable multiband filtering means interposed in the imaging path and effecting a tunable multiband pass filtering of the image replicas. The beam divider can be a pentaprism, wherein one of its five surfaces is coated with a polychroic mirror substrate, and this coated surface is cemented together with a triangular prism to make the prism's rear surface parallel to the front surface of the pentaprism. Furthermore, the pentaprism comprises a tilting actuator to adjust the tilting angle of the prism assembly and therewith the direction of the imaging ray beams. The images of a target object are focused by an objective lens optics and divided by the tilting prism assembly onto two image sensors, wherein, at each tilting step, a new set of images is obtained simultaneously in a snapshot mode, so that, in a scanning mode, the entire spectrum can be scanned to create a complete spectral cube for the target object. Hereby, for adapting to a variety of sizes of sensor arrays, the path lengths of the two split beams have to be substantially equal. Consequently, the suitability of this hybrid spectral imager for arrangement in the narrow shaft of a COTT endoscope is limited.
In US 2014/0225992 A1, a minimally invasive surgical system with an image capture unit including a prism assembly and a sensor assembly is described. The image capture unit includes a shared lens assembly followed by a sensor assembly, wherein the sensor assembly includes a prism assembly, a reflective unit, and co-planar image capture sensors. Hereby, the two image capture sensors, which have similar image sizes, are symmetric across a plane that intersects the longitudinal axis of the stereoscopic endoscope, and only a small gap is arranged between the respective surface of the prism assembly and each co-planar image capture sensor.
An imaging device is presented for a distal end section of an endoscope. The imaging device comprises a pentaprism with a main body and a prism wedge and also includes an optical axis, an entrance surface, a second internal reflective surface and a first exit surface arranged on the main body, and a second exit surface arranged on the prism wedge. The pentaprism further comprises a dichroic beam splitting layer as a first internal reflective and transmissive splitting layer. At least a first image sensor for capturing light of a first spectral region and a second image sensor for capturing light of a second spectral region are assignable to the imaging device. The first spectral region and the second spectral region differ at least partially from each other, and the dichroic beam splitting layer is arranged between the main body and the prism wedge of the pentaprism, so that incident beams comprising the first and second spectral regions are split by the dichroic beam splitting layer into first beams of the first spectral region and second beams of the second spectral region. At least one separate optical element is arranged between the pentaprism and the first image sensor and/or the second image sensor for changing a property of the light of the first spectral region and/or the second spectral region, so that two optimized separate images may be captured by the first image sensor and the second image sensor for superimposed imaging of the light of the first spectral region and the second spectral region.
The imaging device may be utilized in chip-on-the-tip endoscopes, enabling the improved simultaneous capturing of images of two different spectral regions by two separate image sensors within the restricted space available in the shaft of this kind of endoscope. Consequently, an increase of the overall frame rate and an improved image quality can be achieved in the small, constrained space of a video endoscope shaft. Due to the specific design of the pentaprism and the at least one separate optical element arranged between the pentaprism and the respective image sensor or sensors, the first and second split beams can be further manipulated and/or corrected independently from each other for superimposed imaging of the light of both spectral regions at the best available imaging quality. Therewith, at least two optically well-corrected, separate images can be captured with a miniaturized design of the imaging device that can be, or is, arranged in the small space available in shafts of COTT endoscopes.
Due to the specific pentaprism design, an afocal beam bundle does not have to be provided by an objective lens system upstream of the imaging device, where, due to the construction and space requirements in the narrow shaft of a COTT endoscope, it is not possible to generate an afocal beam bundle. Instead, the pentaprism can be arranged in a converging beam path and is designed such that both image sensors simultaneously receive the image from the object space without aberrations.
Furthermore, due to the specific geometric and space-saving design of the pentaprism, a further improvement of the respective beam path is enabled by the at least one separate optical element arranged between the pentaprism and the respective sensor, wherein the respective separate optical element can be present in only one of the two respective paths, or similar or different optical elements can be arranged in both respective paths. Therewith, the sensitivity, image brightness and/or resolution of a defined spectral region can be improved without necessarily influencing the optical parameters of the other spectral region to be captured by the respective image sensor.
One of the key innovations of the present invention is the special design of the pentaprism with a prism wedge in combination with a downstream separate optical element arranged between the pentaprism and at least one of the respective two image sensors, providing the simultaneous capture of two images at a high frame rate and a high sensitivity. Additionally, the imaging device is able to be arranged in the small, limited space of a shaft of a COTT endoscope. Therewith, both image sensors can be used with a full frame rate and/or with specially adapted image size and/or properties. Therewith, a space-saving geometry of the imaging device is realized, due to the special design of the pentaprism in combination with the downstream-arranged separate optical element, which allows an easy fitting of the imaging device and/or the assignable image sensors inside the shaft of an endoscope in the longitudinal and radial direction of the shaft and enables a simultaneously improved capture of separate images in a first spectral region and a second spectral region, each by a separate, dedicated image sensor, resulting in adapted and/or improved image quality. Due to the design of the pentaprism in combination with the separate optical element, a frame rate of 60 frames per second or even more can be achieved.
In a further embodiment of the imaging device, the separate optical element and/or a further separate optical element is or are a lens for demagnification of the light of the first spectral region and/or the second spectral region. It should be noted that the separate optical element may include multiple optical elements.
Even though a demagnification lens can be set in both the first split beams of the first spectral region and the second split beams of the second spectral region downstream of the pentaprism, it is especially advantageous to arrange one demagnification lens between the pentaprism and a fluorescence image sensor, by which the relative intensity (per unit area of the sensor) of the fluorescence image is increased at the expense of resolution. While, in fluorescence imaging, resolution is generally not the limiting parameter, the increased fluorescence intensity significantly improves the signal-to-noise ratio at the fluorescence image sensor and therewith improves the overall imaging quality. By decreasing the image size by means of a demagnification element, such as a lens, the light sensitivity for the first and/or second spectral region can be selectively increased. Consequently, a smaller sensor can be used, providing more light intensity in the respective spectral region. The ability to use a smaller image sensor is especially advantageous for an image sensor arranged perpendicular to the longitudinal direction of the shaft due to the size limitation caused by the inner shaft diameter. Furthermore, due to the demagnification element, aberrations during image capturing by the respective image sensor are inhibited.
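As an illustrative relation only (not limiting), if the demagnification element reduces the image diameter from D1 to D2 while the same light flux is collected, the irradiance at the fluorescence sensor increases approximately with the square of the demagnification, I2/I1 ≈ (D1/D2)². For example, halving the image diameter concentrates the same fluorescence signal onto roughly one quarter of the sensor area and thus roughly quadruples the signal per unit area, at the expense of spatial sampling.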
For providing a variable focal length, the separate optical element and/or a further separate optical element can be moveable.
Therewith, by at least one moveable optical element, the image detail on the image plane of the respective image sensor can be changed and adapted for different image sections and/or different applications. For moving the at least one separate optical element, for example a linear motor and/or a piezoelectric element can be used. Likewise, the moveable optical element can be connected by a tension wire to the operator control at the handle of an endoscope. The separate optical element may be moveable along the longitudinal axis of the shaft, for example.
In a further embodiment of the imaging device, the separate optical element and/or a further separate optical element comprises a mirror element for redirecting the first beams of the first spectral region and/or the second beams of the second spectral region.
By arranging a separate optical element comprising a mirror element after the pentaprism, the respective spectral region not redirected by the pentaprism itself can be redirected by the mirror element. Therefore, for example, both image sensors can be arranged with their image planes parallel to the longitudinal axis of the pentaprism and/or the shaft. Consequently, neither of the two image sensors needs to be arranged in the cross-section of the shaft, perpendicular to the longitudinal axis.
For enabling fluorescence imaging and/or capturing only certain wavelengths of the respective spectral region, the at least one separate optical element and/or a further separate optical element can be a filter for blocking an excitation wavelength or for discriminating further the first spectral region and/or second spectral region.
Therefore, in fluorescence imaging, the filter, as a fluorescence filter, can block the excitation wavelength from reaching the respective detecting image sensor, so that the captured fluorescence light comprises only the light emitted by an applied fluorophore in the object field. Likewise, the wavelength range of the split first or second spectral region can be further modified by a filter, for example an edge or band filter, for discriminating certain wavelengths.
In a further embodiment of the imaging device, the dichroic beam splitting layer is arranged on a surface of the main body adjacent to a surface of the prism wedge or on a surface of the prism wedge adjacent to a surface of the main body.
By arranging the dichroic beam splitting layer on a respective surface of the main body or the prism wedge, a compact and space-saving beam splitting pentaprism is provided which is small enough to be arranged in the shaft and/or the distal end section of the shaft of a COTT endoscope.
For integrating the image capturing into the imaging device, the imaging device comprises the first image sensor and/or the second image sensor.
Thus, while maintaining the compact design of the pentaprism and the imaging device, the first image sensor and/or the second image sensor can be arranged outside of the pentaprism and/or following the at least one optical element, in a radial direction from the longitudinal axis of the shaft or perpendicular to the longitudinal axis. Consequently, the first image sensor and/or the second image sensor can likewise be arranged in the shaft and/or its distal end section. Hereby, both image sensors for capturing the first beams with the first spectral region and the second beams with the second spectral region can be arranged at a respective distance to the pentaprism and/or to the at least one optical element such that the optical path lengths of both paths have the same value or different values. With the same value of the optical path lengths, an in-focus image is acquired for both the first and second beams.
In a further embodiment, the first image sensor and the second image sensor are arranged perpendicular to each other or both parallel to the optical axis of the imaging device.
Therewith, the first image sensor and the second image sensor can be arranged as required, depending on each respective inner diameter of a shaft of an endoscope.
For enabling fluorescence imaging, a light source is assignable to the imaging device, and the light source comprises a first illumination spectral region comprising white light and a second excitation spectrum region comprising excitation light, so that a fluorophore within an illuminated scene is caused to emit fluorescent light.
Therewith, despite employing just one optical path for the incoming beams before the pentaprism, by means of the beam splitting pentaprism and therewith the imaging device, white light and fluorescence images can be collected simultaneously and separately by the first image sensor dedicated to white light imaging in the visible light range and the second image sensor dedicated to fluorescence imaging. Consequently, a high overall frame rate in the overlay mode and increased sensitivity for fluorescence imaging can be obtained while maintaining the white light brightness and a high fluorescence light intensity in a chip-on-the-tip endoscope, as compared to conventional endoscopes shuttering between white light and fluorescence frames.
In a further embodiment of the imaging device, the second image sensor for capturing the second spectral region comprising fluorescent light has a smaller image area than the first image sensor for capturing the first spectral region comprising white light.
In a further aspect of the invention, the problem is solved by an objective system for an endoscope, wherein the objective system is arrangeable in a distal end section of an elongate shaft of the endoscope and at least a first image sensor for capturing images of a first spectral region and a second image sensor for capturing images of a second spectral region are arrangeable in the elongate shaft, wherein the objective system comprises an objective lens system with at least a first lens, at least a second lens, and optional further lenses in order from an objective side to receive image light and to pass the image light towards the first image sensor and the second image sensor, wherein the objective system comprises a previously described imaging device, wherein the imaging device is arranged between a most proximal lens of the objective lens system and the first image sensor and/or the second image sensor.
Thus, an objective system for an endoscope is provided with only one optical path for the two spectral regions, preferably for white light and fluorescence light imaging, wherein, after the lens system, the one optical path is split by the pentaprism into a separate first optical path for the first beams within the first spectral region and a separate second optical path for the second beams within the second spectral region, allowing the simultaneous capture of images associated with each of the first and second spectral regions by separate dedicated image sensors.
In a further embodiment of the objective system, beams entering and/or exiting the pentaprism are focal and/or converging.
Consequently, an afocal beam path is not necessary in the objective system, and post-focusing after passing through the pentaprism likewise does not have to be applied. Despite using a focal and/or converging beam path, due to the design of the pentaprism and the subsequently arranged at least one separate optical element, aberrations commonly occurring in systems known from the state of the art do not occur in the inventive objective system, and an overall improved image quality is provided.
Another aspect of the invention presented includes an endoscope, in particular a medical or industrial video endoscope, with a handle, an elongate shaft, a light source, an objective system and/or an image processing unit and/or a display system, wherein the endoscope comprises a previously described imaging device or the objective system is a previously described objective system, so that images of the first spectral region are captured by the first image sensor and images of the second spectral region are captured by the second image sensor separately in parallel and/or are displayable as superimposed images by the display system.
The invention is further explained by the following exemplary descriptions of particular embodiments.
As utilized in accordance with the present disclosure, the following terms, unless otherwise indicated, shall be understood to have the following meanings.
An “endoscope,” more particularly a video endoscope, is an endoscope with a means for digital image acquisition in and/or at the distal end of the elongate shaft, and the transmission of data therefrom, for example, to the proximal end of the endoscope. The endoscope comprises an elongate shaft and a handle that are connectable to each other. In the present invention, at least two digital image sensors are located in the elongate shaft or at the distal end of the elongate shaft for image acquisition. The particular video endoscope may be any kind of digital endoscope, for example a 2D colonoscope, a laparoscope or gastroenteroscope or a 3D video endoscope. The endoscope may be a chip-on-the-tip endoscope (COTT).
The “elongate shaft” is, in most implementations, a rigid, semi-flexible or flexible tube. The shaft may be configured for being inserted into a cavity, for example that of a human or animal body, to be viewed endoscopically. In industrial applications, the endoscope, or borescope, shaft will be placed into an element such as a pipe or another area which is difficult to access directly, such as behind a wall. Generally, the shaft may have an outer diameter in the range of 4 mm to 10 mm. Besides the objective system, the imaging device, and two or more image sensors, the shaft may include one or more channels for irrigation or passing through working instruments (generally referred to as “working channels”) in order to achieve the desired effect in the cavity or opening. The shaft can be detachably connected at its proximal end to a handle or be permanently connected thereto. The distal end section of the elongate shaft is the section remote from the user, while the proximal end section of the shaft is closer to the user.
An “objective system” is an optical system which includes an objective lens system to receive, pass forward and/or modify the image light from an object space.
An objective lens system includes, for example, in an order from an object side, a cover glass and/or a first lens, a second lens, and optional further lenses, which are arranged along an optical axis of the lens system. Optionally, one or more optical filters can be located between any two lenses of the lens system.
A “lens” is a transmissive optical body that focuses or disperses light beams (light rays) by means of refraction. The first lens, the second lens, and optional further lenses can be single lenses, which are separated by an air gap or are in contact with adjacent lenses at most pointwise. A lens can also be a combined (compound) lens and/or a rod lens. Preferably, the lenses are made of glass and/or a crystalline material; however, this is not limiting.
An “imaging device” is, for example, a device which further modifies the image light passed forward by the objective system and divides the image light between the at least two image sensors. The imaging device comprises at least the pentaprism and at least one separate optical element, and at least two image sensors for capturing the image are assignable to it or included in it. The imaging device does not necessarily have to be arranged in the distal end section of the shaft close to the objective system but can be located more proximally, at a distance to the objective system in the distal end section.
A “pentaprism” is an optical component in the form of a geometric prism body with five sides. The pentaprism comprises a main body and a prism wedge together forming the five reflecting, transmitting and/or non-beam-passing sides of the pentaprism in a longitudinal section view. The pentaprism can be used for different optical effects. The optical properties of the pentaprism generally depend on the angles and/or the positions of the optically effective prism surfaces in relation to one another and on the refractive indexes of the materials of the main body and the prism wedge. In particular, the main body and the prism wedge comprise glass, and preferably the glasses of the main body and the prism wedge are selected to have the same, or nearly the same, refractive index. For example, the refractive indexes of the main body and the prism wedge can both be 1.6. In a sectional view parallel to the travel direction of the incident light, and therewith in a longitudinal section, the pentaprism has a substantially pentagonal shape. In some preferred embodiments, the pentaprism has a length in the longitudinal direction, and therewith along the optical axis, of less than 5.0 mm. In some embodiments, the pentaprism has a diameter of less than 3.5 mm perpendicular to the optical axis. The prism wedge is preferably arranged proximal of the main body and cemented to the main body. The cemented prism wedge may be employed to prevent a beam shift error on the second image sensor, arranged in particular in the proximal direction perpendicular to the optical axis.
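As an illustrative note on the matched refractive indexes, Snell's law, n1·sin(θ1) = n2·sin(θ2), yields θ2 = θ1 when n1 = n2, so that, apart from the action of the dichroic beam splitting layer itself, a beam crossing the cemented joint between the main body and the prism wedge is, to a first approximation, neither deviated nor laterally shifted; this is one way to understand how the cemented prism wedge can help prevent a beam shift error at the second image sensor.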
An “entrance surface” on the main body of the pentaprism is generally the surface of the pentaprism through which the incoming beams (incident beams) comprising the first and second spectral regions enter the pentaprism. A region of the entrance surface on the main body may act as an entrance aperture, and regions outside of an effective diameter of the incoming beam may be made opaque to incoming light. Preferably, incoming beams only enter the pentaprism via the entrance surface of the main body. Therefore, areas of the entrance surface of the main body and/or non-beam passing surface of the main body and the prism wedge may be coated with an opaque substance, such as a black paint, to avoid any stray or unwanted light from passing thereinto.
The “second internal reflective surface” is a surface of the main body of the pentaprism, on which the first beams, split off by the dichroic beam splitting layer arranged between the main body and the prism wedge, are reflected internally in the main body and directed towards the first image sensor. The “second internal reflective surface” of the main body directs the split first beams perpendicular to the longitudinal axis of the pentaprism and onto the image plane of the first image sensor. In particular, the second internal reflective surface is a mirror surface serving as an internal reflection surface. Preferably, the second internal reflective surface is the only reflective surface of the main body, with the exception of the dichroic beam splitting layer.
A “first exit surface” is generally a surface on the main body of the pentaprism that is substantially parallel to the longitudinal axis of the pentaprism and/or the shaft. The first exit surface is in particular the longest side of the pentaprism in the sectional view along the longitudinal axis. The first beams split and reflected by the second internal reflective surface of the main body in particular leave the main body through the first exit surface. In particular, the surfaces of the main body of the pentaprism are arranged such that, in each case, an angle of 90° is formed between the incoming beams entering the main body through the entrance surface and the split and reflected first beams leaving the main body through the first exit surface, and/or between the first beams reflected by the dichroic beam splitting layer and the beams reflected a second time by the second internal reflective surface, which leave the main body through the first exit surface.
A “dichroic beam splitting layer” in most embodiments is a thin layer that selectively reflects and transmits light depending on the light's wavelength. The dichroic beam splitting layer is selected such that the incident beams are split into the first beams of the first spectral region and the second beams of the second spectral region. The dichroic beam splitting layer is arranged between the main body and the prism wedge of the pentaprism. In particular, the entrance surface of the main body, through which incoming beams pass into the main body, is arranged substantially opposite to the surface of the main body at which, or adjacent to which, the dichroic beam splitting layer is arranged. The dichroic beam splitting layer is in particular inclined relative to the optical axis, the longitudinal axis of the pentaprism, and/or the shaft, and towards a vertical direction perpendicular to the longitudinal axis and/or optical axis. The dichroic beam splitting layer can be a coating applied to the respective surface of the main body and/or the surface of the prism wedge, which are adjacent to each other. The adjacent surfaces of the main body and the prism wedge, with the associated coating or coatings, may be adhered by optical cement. The combination of these coated surfaces and optical cement may thereby act as a dichroic filter. By the phrase “the dichroic beam splitting layer is arranged between the adjacent surfaces” it is understood that, in particular, the dichroic beam splitting layer and/or coating is arranged on at least one of the two adjacent surfaces of the main body and the prism wedge.
A dichroic beam splitting layer can also be a dielectric coating or a dichroic mirror. The dichroic beam splitting layer is usually formed as a first internal reflective and transmissive splitting layer causing the first beams to be reflected internally and directed towards the second internal reflective surface of the main body and passing the second beams towards the prism wedge. In particular, the dichroic beam splitting layer is arranged such that any two spectral regions can be split and therewith separated from each other. In particular, the dichroic beam splitting layer divides the incident beams into first beams with a spectrum of 400 nm to 658 nm in the visible range and second beams with a spectrum of 800 nm to 950 nm in the near-infrared range.
A “second exit surface” is a surface on the prism wedge, through which the transmitted second beams of the second spectral region leave the pentaprism. The second exit surface is generally arranged parallel to the entrance surface of the main body. The entrance surface of the main body is arranged on the distal side, and the second exit surface of the prism wedge is arranged on the proximal side of the pentaprism, the entrance surface of the main body and the second exit surface of the prism wedge being opposite to each other.
A “separate optical element” is any kind of optical element that modifies, controls and/or corrects a property of the first beams of the first spectral region or the second beams of the second spectral region. The separate optical element is, or two or several separate optical elements are, in particular arranged downstream of the pentaprism. Therewith, one separate optical element, or two or several separate optical elements, can be arranged between an outer surface of the pentaprism and the respective image sensor, either in the radial direction from the optical axis or along the optical axis and therewith the longitudinal axis of the pentaprism or the shaft. For a space-saving imaging device, the at least one separate optical element is arranged between the second exit surface of the prism wedge and the image sensor that is arranged with its image plane perpendicular to the optical axis and/or the longitudinal direction of the pentaprism and/or the shaft. Therefore, preferably, the one separate optical element is, or two or more separate optical elements are, arranged on the proximal side of the pentaprism within the shaft of the endoscope.
The term “separate” when referring to the optical element is understood such that this optical element is usually not an integral part of the pentaprism and likewise not an integral part of either of the image sensors. Therefore, the separate optical element in most embodiments is not cemented to the second exit surface of the prism wedge or connected to the pentaprism or the image sensor in direct contact; however, this is not limiting, and some embodiments are envisioned where these elements may be connected and/or in direct contact. Furthermore, the term “separate” can also mean that the pentaprism and the successive optical element have different optical functions within the imaging device.
The “optical axis” is, in particular, a line along which some degree of rotational symmetry exists in an optical system, and is an imaginary line that defines the path along which the incoming beams propagate through an objective system in front of the imaging device or enter the imaging device. Preferably, the optical axis passes through the center of curvature of each optical element within a lens system and/or objective system and/or through the entrance surface of the pentaprism of the imaging device. However, the optical axis can also be bent and/or redirected by a lens, an optical element, the imaging device, and/or the at least one optical element downstream of the pentaprism and/or prism wedge.
An “image sensor” preferably has its sensor plane in an image plane of the imaging device. The sensor plane of the respective image sensor can be arranged substantially at a distance from, and in parallel or perpendicular to, the longitudinal axis of the pentaprism and/or of the shaft. In general, the terms “first image sensor” and “second image sensor” are only used for differentiation. Therefore, instead of the first image sensor, the second image sensor can also be arranged, and vice versa. Preferably, the first image sensor is arranged parallel to the first exit surface arranged on the main body of the pentaprism, and the second image sensor is preferably arranged perpendicular to the longitudinal axis of the pentaprism and/or the shaft and therewith parallel to the second exit surface arranged on the prism wedge of the pentaprism. In preferred embodiments, the image size at the respective image sensor is at least 1.6 mm. The image planes of the first image sensor and the second image sensor can have the same size and/or properties or different sizes and/or properties. Preferably, the image sensor capturing fluorescence and/or infrared light has a smaller image size than the image sensor capturing visible light. For example, the image sensor capturing visible light can have an image size diameter of 3.01 mm and an F-number of 6, and the image sensor capturing infrared light can have an image size diameter of 1.65 mm and an F-number of 2.8. The image sensor may be, for example, a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) sensor. Preferably, the electronic image sensor is a high-definition (HD) image sensor having, for example, full HD resolution. In general, the electronic image sensor is configured to convert the captured image into electrical image signals and therefore image data. In particular, the electronic image sensor is arranged in the shaft and/or distal end section, e.g., the tip of the shaft, and transmits the electrical image signals from the shaft or the distal end of the shaft to its proximal end by electric transmission lines, such as wires, cables and/or a flexible printed circuit board. Preferably, the electrical image signals generated by the electronic image sensor are transferred from the shaft to the handle of the endoscope and/or a display system and/or a processing unit for displaying the captured images. Alternatively, the electrical image signals may be transmitted wirelessly, either directly from the shaft and/or the distal end section, or after being relayed to a transmitter contained in the handle. In the case of a 3D video endoscope, four image sensors are arranged in parallel, each set of two image sensors providing visible and fluorescence images from a given, distinct perspective.
Due to the separate optical element arranged behind the pentaprism in the proximal direction, the sensitivity of the image on the second image sensor for capturing fluorescent and/or IR light can be increased by reducing the size of the image by means of the at least one separate optical element. For example, the image size can hereby be decreased by approximately 45%, so that the image size at the second sensor is reduced from 3.01 mm to 1.65 mm. Consequently, this results in different system F-numbers, wherein the first image sensor for visible light has an image size diameter of 3.01 mm and an F-number of 6 and the second image sensor has a reduced image size diameter of 1.65 mm and an F-number of 2.8. Hereby, the light intensity at the second sensor capturing fluorescent light and/or IR light, in relation to the visible beam path, is increased by a factor of 4.5.
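As a hedged plausibility check of these exemplary values, the relative irradiance at the image plane scales approximately with the inverse square of the F-number, so that (6/2.8)² ≈ 4.6, consistent with the stated intensity gain of about 4.5 for the fluorescence path; likewise, 1.65 mm/3.01 mm ≈ 0.55, corresponding to an image size decreased by approximately 45%.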
With respect to “the first spectral region and the second spectral region differ at least partially from each other,” it is understood that the first spectral region, and therewith a first wavelength band, and the second spectral region, and therewith a second wavelength band, do not include exactly the same wavelengths. However, the first spectral region may completely include the second spectral region, or vice versa. For example, the first spectral region may comprise the wavelength range of 400 nm to 900 nm and the second spectral region may comprise the wavelength range of 700 nm to 800 nm.
“White light” (also called “visible light”) is usually understood to refer to a combination of wavelengths of light from 380 nm to 750 nm, that is, between the ultraviolet and infrared regions, i.e., electromagnetic radiation within the portion of the spectrum perceived by a healthy human eye.
“Fluorescent light” is generally understood to mean an emission of light by a substance, called a fluorophore, that has absorbed light or other electromagnetic radiation. The fluorophore is usually irradiated with a specific excitation wavelength or wavelength band, resulting in the emission of light with a specific emission wavelength or wavelength band. Normally, the emission wavelength is longer than the excitation wavelength. For example, in the case of the commonly used fluorophore indocyanine green (ICG), the excitation wavelength range is between 600 nm and 900 nm and the emission wavelength range is between 750 nm and 950 nm in the IR spectrum. In fluorescence imaging, which is often used to optically define a tumorous region during surgery, a biological material, such as tissue in a body cavity, is dyed with a fluorophore directly, or an administered substance is converted into a fluorophore by the body or a microorganism prior to imaging with an endoscope. Additionally, autofluorescence can also be observed without previous colorization by a fluorophore or dye.
Consequently, “fluorescence light” may refer to the excitation and/or emission wavelengths or wavelength bands of a fluorophore. Radiation that causes a fluorophore to emit fluorescent light is generally referred to as “excitation light” and the resulting light emitted from the fluorophore is referred to as “emission light” or “fluorescent light”. In fluorescence imaging, an optional fluorescence filter can block the excitation wavelength from reaching the detecting image sensor, and therefore the fluorescence light comprises only the light emitted by the fluorophore.
The “refractive index” of an optical medium is a dimensionless number that gives an indication of the light bending ability of the medium. The refractive index determines how much the path of light is bent or refracted when entering the optical medium. For example, for the pentaprism including the prism wedge a refractive index of 1.6 or above is preferred.
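As a simple worked example of this bending, and assuming an air-to-glass interface with a refractive index of 1.6, Snell's law gives sin(θ2) = sin(30°)/1.6 ≈ 0.31 for a ray incident at 30°, i.e., a refracted angle of approximately 18°, so the ray is bent towards the surface normal on entering the optically denser medium.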
A “mirror element” may be any optical element that reflects light and/or redirects light. The mirror element comprises at least one reflective surface. The mirror element can be, for example, a planar mirror surface and/or an additional reflective prism.
Turning now to the embodiments shown in the drawings, the video endoscope 101, as shown in
The video endoscope 101 is designed to provide video and image data from an object space within a cavity of a body (not shown). For this, the elongate shaft 105 comprises at its distal end 109 a distal end section 111. The distal end section 111 of the elongate shaft 105 comprises an objective system 301 with an objective lens system 303 followed by an imaging device 400 on a proximal side 329 (see
The imaging device 400 comprises a pentaprism 401 with a prism main body 403 and prism wedge 405. The prism wedge 405 is cemented to the prism main body 403. Furthermore, the imaging device 400 comprises, after the pentaprism 401 along an optical axis 373, a fluorescence filter 419 and a combined demagnification lens 421. A first image sensor 423 for capturing white light is arranged parallel to a first exit surface 413 of the pentaprism 401 and therewith parallel to the optical axis 373. On the proximal side of the demagnification lens 421, a second image sensor 425 is arranged for capturing fluorescence light (
The pentaprism 401 comprises an entrance surface 407 on its prism main body 403, that is arranged perpendicular to incident beams 311 and the optical axis 373. Furthermore, the prism main body 403 comprises a second internal reflective surface 411 and a first exit surface 413 directed towards the first image sensor 423. The prism wedge 405 is arranged and cemented inclined to the prism main body 403 on the proximal side of the prism main body 403. The prism wedge 405 comprises a second exit surface 415 perpendicular to the optical axis 373. A dichroic beam splitting layer 417 is arranged between the prism main body 403 and the prism wedge 405 forming a first internal reflective and transmissive surface 409.
The objective lens system can be arranged directly at the distal tip of the shaft and/or in the distal end section of the shaft, while the imaging device can be arranged at a distance further towards the proximal side or in a proximal section of the shaft. By this, one intermediate image, or two or more intermediate images, can be provided along the optical axis in the proximal direction between the objective lens system and the imaging device, providing a longer focal length. Therewith, for example, the distance between the most proximal lens of the objective lens system and the respective image sensor arranged along the optical axis can be longer, e.g., 70 mm instead of a conventional 20 mm. Consequently, it is not necessary to use a larger video objective and a consequently larger prism, which would limit the spatial arrangement of the two image sensors. By arranging the imaging device further in the proximal direction, at a distance from the most proximal lens of the objective lens system, a smaller and more compact design of the pentaprism and a flexible arrangement of the two image sensors can be applied.
Therewith, the function of the pentaprism is integrated into the overall system by a special optical design for video endoscopes, wherein at least one intermediate image after the most proximal lens allows a better control of the image-side focal length to reach the entire distance through the pentaprism. Additionally, the stray light behavior of the complete optical system is improved. For example, the optical design is arranged for the following optical parameters: a diameter of the optical system of 3.1 mm, a focal length of −0.8 mm, an F-number of 4.5 to 6.0, a diameter of the entrance pupil of 0.19 to 0.14, achromatization for a wavelength range of 400 nm to 950 nm, and an image scale of 0.018.
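As a hedged plausibility check of these exemplary parameters, the entrance pupil diameter of a lens system is commonly approximated by the magnitude of the focal length divided by the F-number, D ≈ |f|/N; with |f| = 0.8 mm this gives approximately 0.8/4.5 ≈ 0.18 and 0.8/6.0 ≈ 0.13, values on the order of the quoted entrance pupil range of 0.19 to 0.14 (presumably in millimeters).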
In the objective system 301, the lens system 303 comprises, along the optical axis 373 from the distal side 327 towards the proximal side 329, a cover glass 339, a first, most distal objective lens 340, followed by an attenuating and/or relaying optical element 341, a second lens 343 formed as a combined lens, a third lens 345, and a fourth lens 347, arranged in the distal end section 111 of the elongate shaft 105. At a distance from the fourth lens 347 along the optical axis 373 in a proximal direction is a fifth lens 351 formed as a combined lens. At a distance from the fifth lens 351 is a sixth lens 353 formed as a combined lens, and at a distance therefrom is a seventh lens 355, likewise formed as a combined lens and being the most proximal lens of the lens system 303. After the seventh lens 355, as the most proximal lens, the imaging device 400 with the pentaprism 401 as described is arranged on the proximal side 329.
The elongate shaft 105 has an inner diameter of 4 mm and the objective system 301 a diameter of 3.1 mm. The pentaprism 401 has a length along the optical axis 373 of 5 mm, and the overall imaging device 400, including the pentaprism 401, has a length of 10 mm.
In using the video endoscope 101, object beams 309 coming from the object space at the distal side 327 are collected by the objective system 301 and conditioned by the objective lens system 303 in a single optical path with converging beams. After the last, seventh lens 355 of the lens system 303, incident beams 311 enter the pentaprism 401 through the entrance surface 407 into the prism main body 403. As both the prism main body 403 and the prism wedge 405 have a refractive index of 1.6, the entering incident beams 311 are split selectively by the dichroic beam splitting layer 417 into white light beams 323 and fluorescence beams 325. The reflected white light beams 323 are directed towards the second internal reflective surface 411 at a first angle 431 of 70° relative to the first internal reflective and transmissive surface 409. The split and redirected white light beams 323 are reflected by the second internal reflective surface 411 of the prism main body 403 at a second angle 433 of 65° relative to the second internal reflective surface 411 and in the direction of the first image sensor 423. Hereby, the twice-reflected white light beams 323 intersect the optical axis 373 at an angle of 90° and have a third angle 435 of 90° relative to the first exit surface 413 of the prism main body 403, wherein the beams pass the first exit surface 413, an air gap, and then, through a first sensor cover glass 427, intersect the image plane of the first image sensor 423. The fluorescence beams 325 pass the dichroic beam splitting layer 417 along the optical axis 373 and leave the prism wedge 405 through the second exit surface 415 in the proximal direction (see
The fluorescence filter 419, arranged separately on the proximal side of the second exit surface 415, serves to absorb the excitation wavelength of the fluorescence light and only passes the emission wavelength band of the fluorophore used in the object space. Subsequently, the demagnification lens 421 converges the fluorescence beams 325 further before these beams pass an air gap and a second sensor cover glass 429 and intersect the image plane of the second image sensor 425 for capturing fluorescence light. Thereby, a smaller image size diameter of 1.65 mm is used at the second image sensor 425 capturing fluorescence light, in contrast to an image size diameter of 3.01 mm at the first image sensor 423 capturing white light. Respectively, the first image sensor 423 has an F-number of 6.0 and the second image sensor 425 has an F-number of 2.8. The demagnification lens 421 enables this reduced image size of the image captured by the second image sensor 425, increasing the intensity of the image per unit area on the second image sensor 425 for fluorescence light and improving the sensitivity of the image sensor 425 to the fluorescent image by increasing the signal-to-noise ratio. Simultaneously, by the demagnification lens 421, aberrations are inhibited in image capturing by the second image sensor 425.
Therewith, a video endoscope 101 is provided with a compact imaging device 400 and a compact pentaprism 401 providing beam splitting into white light and fluorescence light arranged in the narrow shaft 105, allowing the simultaneous capturing of separate white light beams 323 and fluorescence beams 325 by separate first and second image sensors 423, 425 with high-quality images and a high frame rate of, for example, 60 frames per second, which would not be possible, or would be highly impractical, with conventional systems.
The above-described exemplary embodiments are intended to illustrate the principles of the disclosed technology, but not to limit the scope of the disclosed technology. Various other embodiments and modifications to these exemplary, non-exhaustive embodiments may be made by those skilled in the art without departing from the scope of the disclosed technology. For example, in some instances, one or more features disclosed in connection with one embodiment can be used alone or in combination with one or more features of one or more other embodiments. More generally, the various features described herein may be used in any working combination.
Foreign Application Priority Data: Number 102023130062.8 | Date: Oct. 2023 | Country: DE | Kind: national