The invention relates generally to endoscopy systems and, in particular, to image capture and video processing systems and methods in multiple-illuminator endoscope systems.
Free space is an extremely valuable resource within a multiple camera endoscope tip section. Such tip sections typically include a plurality of cameras, a plurality of optical systems, a plurality of illuminators, a flexible electronic circuit board configured to support and encapsulate the components, and a working channel configured for the injection of fluids and for the insertion of minuscule surgical tools.
An optical system for a tip section of a multiple sensor endoscope comprising a front-pointing camera sensor, a front objective lens system, a side-pointing camera-sensor, and a side objective lens system is disclosed in U.S. patent application Ser. No. 13/882,004, entitled “Optical Systems for Multi-Sensor Endoscopes” and filed on May 23, 2013, which is herein incorporated by reference in its entirety.
A flexible electronic circuit board for a multiple camera endoscope tip section is disclosed in Patent Cooperation Treaty Application Number PCT/IL2011/050049, entitled “Flexible Electronic Circuit Board for a Multi-Camera Endoscope” and filed on Dec. 8, 2011, which is herein incorporated by reference in its entirety. The circuit board comprises: a front camera surface configured to carry a forward looking camera; a first side camera surface configured to carry a first side looking camera; a second side camera surface configured to carry a second side looking camera; one or more front illuminator surfaces configured to carry one or more front illuminators; and, one or more side illuminators surfaces configured to carry one or more side illuminators.
The flexible circuit board is connected to the central control unit via a multi-wire cable. The multi-wire cable is welded on the board in a designated location, freeing additional space within the tip section assembly and adding flexibility to the cable access.
A multiple sensor or multiple viewing elements endoscope tip section comprising a front-pointing camera and two or more side-pointing cameras positioned at or in proximity to a distal end of the tip section and a working channel configured for insertion of a surgical tool is disclosed in U.S. patent application Ser. No. 13/655,120, entitled “Multi-Camera Endoscope”, filed on Oct. 18, 2012, assigned to the Applicant of the present specification, and herein incorporated by reference in its entirety. As described in the '120 application, the field of view (FOV) of each camera sensor in a multiple sensor endoscope is illuminated by two or more illuminators that are light emitting diodes (LEDs). Thus, a multiple sensor endoscope tip that includes a right-pointing camera or viewing element, a front-pointing camera or viewing element and a left-pointing camera or viewing element may include nine or more LEDs. Since the depth of the FOVs in different orientations, for example in a patient's colon, can vary significantly during a colonoscopy procedure, driving all LEDs at a fixed illumination intensity is sub-optimal: the illumination may be too weak in some orientations, while in others light reflected from a nearby wall may drive the camera sensor arrays beyond their dazzle limits.
One approach to controlling the illumination of a multiple illuminator endoscope system is to dynamically control the emitted light intensities. However, since multiple illuminator endoscope systems may include 10 or more illuminators, dynamically controlling the light intensity of each illuminator independently of the others may be a difficult task. Another approach to controlling the illumination of multiple illuminator endoscope systems is to dynamically actuate electrically and/or electro-mechanically actuatable lenses.
It would also be highly advantageous to provide lens actuation systems used to dynamically redirect the illumination of multiple illuminator endoscopes.
The present specification discloses a lens actuation system for an endoscope, the system comprising: at least one viewing element configured to capture images; a plurality of illuminators configured to illuminate a plurality of fields of view (FOVs) associated with the at least one viewing element; a controller; and at least one actuatable lens positioned in front of one of the plurality of illuminators, wherein the at least one actuatable lens is configured to dynamically change an illumination direction of the one of the plurality of illuminators based upon a signal from said controller.
Optionally, the lens actuation system comprises a front viewing element with at least two front illuminators and a side viewing element with at least two side illuminators, wherein an actuatable lens is positioned in front of each of the two front illuminators and the two side illuminators to dynamically change an illumination direction of one or both illuminators of the front and side illuminators based upon a signal from the controller.
Optionally, the lens actuation system comprises a front viewing element with at least two front illuminators and a side viewing element with at least two side illuminators, wherein an actuatable lens is positioned in front of the front and side viewing elements to dynamically change direction of incoming light beams to the front and side viewing elements based upon a signal from said controller.
The present specification also discloses a lens actuation system for an endoscope, the system comprising: at least one viewing element configured to capture images; a plurality of illuminators configured to illuminate a plurality of fields of view (FOVs) associated with the at least one viewing element; a controller; and an actuatable lens positioned in front of the at least one viewing element, wherein the actuatable lens is configured to dynamically change direction of incoming light beams, from the plurality of FOVs to the at least one viewing element, based upon a signal from the controller.
The present specification also discloses a lens actuation system for an endoscope, the system comprising: at least one viewing element configured to capture images; a plurality of illuminators configured to illuminate a plurality of fields of view (FOVs) associated with the at least one viewing element; a controller; and actuatable lenses positioned in front of the at least one viewing element and in front of at least one of the plurality of illuminators, wherein, based upon a signal from the controller, the actuatable lenses are configured to dynamically change one or both of a) an illumination direction of said at least one of said plurality of illuminators, and b) a direction of incoming light beams to said at least one viewing element.
The plurality of FOVs may partially overlap.
An actuatable lens (or lenses) may be positioned in front of each of the plurality of illuminators, or an actuatable lens (or lenses) may be positioned in front of more than one of the plurality of illuminators.
Optionally, the actuatable lens (or lenses) is a stiff lens positioned proximate to a plurality of electro-actuators configured to move the stiff lens.
Optionally, the actuatable lens (or lenses) is a flexible lens positioned proximate to a plurality of actuators configured to deform the flexible lens. Optionally, the flexible lens is a silicone lens and the deformation of the flexible lens is selected from a group consisting of: contracting, expanding, pulling, pushing and combinations thereof.
The actuatable lens (or lenses) may be coated with an electro-responsive material that changes its local light refraction index in response to an applied electric field. Optionally, the electro-responsive coating material is selected from a group consisting of: a liquid crystal, an electro-responsive polymer, inorganic crystals, metamaterials and combinations thereof. Optionally, the electric field is applied by a plurality of electrodes.
Optionally, the actuatable lens (or lenses) dynamically changes an illumination direction by rotating or translating or a combination thereof.
Optionally, where light intensity exceeds a predetermined threshold at a certain FOV, due to at least a partial overlap of illumination from more than one of the plurality of illuminators, the at least one actuatable lens is configured to dynamically redirect illumination from at least one of the plurality of illuminators in order to reduce light intensity at the certain FOV.
Optionally, where light intensity is below a predetermined threshold at a certain FOV, the at least one actuatable lens is configured to dynamically redirect illumination from at least one of the plurality of illuminators in order to increase light intensity at the certain FOV.
The at least one actuatable lens may be actuated based on a detection of bright and/or dark areas in the plurality of FOVs.
Optionally, a user interface is configured to allow a user to manually actuate the at least one actuatable lens according to desired light intensity of images presented on a display.
Optionally, a user interface is configured to allow a user to actuate the at least one actuatable lens in order to modify a blooming effect, saturation effect, underexposure effect, or overexposure effect.
The present specification also discloses a method of controlling illumination of an endoscope tip, the method comprising: providing, at the endoscope tip, at least one viewing element configured to capture images, a plurality of illuminators configured to illuminate a plurality of fields of view (FOVs) associated with the at least one viewing element and at least one actuatable lens positioned in front of at least one of the plurality of illuminators; receiving, from the at least one viewing element, images of a plurality of FOVs illuminated by the plurality of illuminators; and dynamically redirecting illumination by actuating the at least one actuatable lens in order to reduce light intensity of a too-bright captured image or increase light intensity of a too-dark captured image.
The aforementioned and other embodiments of the present specification shall be described in greater depth in the drawings and detailed description provided below.
These and other features and advantages of the present specification will be further appreciated, as they become better understood by reference to the detailed description when considered in connection with the accompanying drawings:
In the description and claims of the present specification, each of the words “comprise”, “include”, and “have”, and forms thereof, are not necessarily limited to members in a list with which the words may be associated.
The present specification is directed toward multiple embodiments. The following disclosure is provided in order to enable a person having ordinary skill in the art to practice the invention. Language used in this specification should not be interpreted as a general disavowal of any one specific embodiment or used to limit the claims beyond the meaning of the terms used therein. The general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the invention. Also, the terminology and phraseology used is for the purpose of describing exemplary embodiments and should not be considered limiting. Thus, the present specification is to be accorded the widest scope encompassing numerous alternatives, modifications and equivalents consistent with the principles and features disclosed. For purpose of clarity, details relating to technical material that is known in the technical fields related to the invention have not been described in detail so as not to unnecessarily obscure the present invention.
Reference is now made to
In accordance with various embodiments, viewing elements, cameras or sensors 103, 108 and 111 are Charge Coupled Device (CCD) or Complementary Metal Oxide Semiconductor (CMOS) image sensor arrays. Also, front illuminators 104, 105, 106, 107 and side illuminators 101, 102, 109, 111 are, in an embodiment, discrete illuminators and include a light-emitting diode (LED), which may be a white light LED, an infrared light LED, a near infrared light LED, an ultraviolet light LED or any other LED. The term “discrete”, as applied to an illuminator, refers to an illumination source that generates light locally and internally, in contrast to a non-discrete illuminator, which may be, for example, a fiber optic that merely transmits light generated remotely.
It should be understood that the endoscope tip section 100 includes: a working channel, having an opening positioned on the front face 140, that is configured to inject fluids or gases and to admit surgical tools; a plurality of optical systems that may include front and side objective lens systems; a flexible electronic circuit board configured to carry the front and side viewing elements along with the associated illuminators; the wiring connections between these components; and a cable connecting the parallel illuminating system of the endoscopic tip 100 to an endoscope handle, to an external control unit and to a display.
Reference is now made to
A utility cable 214, also referred to as an umbilical tube, connects the handle 204 to a Main Control Unit 299. Utility cable 214 includes therein one or more fluid channels and one or more electrical channels. The electrical channel(s) include at least one data cable for receiving video signals from the front and side-pointing viewing elements, as well as at least one power cable for providing electrical power to the viewing elements and to the discrete illuminators (that is, to the parallel illuminating system).
The main control unit 299 contains the controls required for displaying the images and/or videos of internal organs captured by the endoscope 202. The main control unit 299 governs power transmission to the tip section 100 of the endoscope 202, such as for the tip section's viewing elements and illuminators. The main control unit 299 further controls one or more fluid, liquid and/or suction pump(s) which supply corresponding functionalities to the endoscope 202. One or more input devices 218, such as a keyboard, a touch screen and the like, are connected to the main control unit 299 for the purpose of human interaction with the main control unit 299. In the embodiment shown in
It should be appreciated that the parallel illuminating system, illustrated in
Reference is now made to
Reference is now made to
Persons of ordinary skill in the art should note that the actuating devices or systems may vary depending upon the type of actuatable lens. Thus, in various embodiments, depending upon whether the actuatable lens is stiff, flexible or coated (with an electro-responsive coating), the actuating devices could be piezoelectric actuators, electrodes, or other miniature electromechanical actuators, such as, but not limited to, linear step motors, step motors with miniature gears, electromagnets, or micro-electro-mechanical systems (MEMS), which include electromechanical mechanisms controlled by electrostatic fields.
It should be appreciated that the viewing elements, which are typically CCD or CMOS image sensors, exhibit phenomena such as saturation and blooming that affect both their quantitative and qualitative imaging characteristics. For example, if each individual pixel can be thought of as a well of electrons, then saturation refers to the condition where the well becomes filled. The amount of charge that can be accumulated in a single pixel is determined largely by its area. However, due to the nature of the potential well, which holds charge within a pixel, there is less probability of trapping an electron within a well that is approaching saturation. Therefore, as a well approaches its limit, the linear relationship between light intensity and signal degrades. As a result, the apparent responsivity of a saturated pixel drops. At saturation, pixels lose their ability to accommodate additional charge. This additional charge then spreads into neighboring pixels, causing them to either report erroneous values or also saturate. This spread of charge to adjacent pixels is known as blooming and appears as a white streak or blob in the image. The occurrence of blooming, in video images generated by a multiple viewing elements endoscope, results in a loss of detail in portions of the video image, which is a serious cause of concern for a physician performing an endoscopic procedure.
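As a minimal sketch of how the saturated regions described above might be flagged in a captured frame (Python/NumPy; the 8-bit full-well value and all names here are illustrative assumptions, not taken from this specification):

```python
import numpy as np

FULL_WELL = 255  # assumed 8-bit sensor output: a saturated pixel reads the maximum

def saturation_mask(luma: np.ndarray, level: int = FULL_WELL) -> np.ndarray:
    """Return a Boolean mask of pixels at or above the saturation level.

    Blooming appears as a cluster of saturated pixels, so a large connected
    region in this mask hints that charge has spilled into neighboring pixels.
    """
    return luma >= level

# Toy 8-bit frame with a saturated 2x2 blob in one corner.
frame = np.full((4, 4), 120, dtype=np.uint8)
frame[0:2, 0:2] = 255
mask = saturation_mask(frame)
```

A per-frame mask like this is the natural input to any downstream logic that decides whether illumination should be redirected away from a glare region.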
Therefore, according to an aspect of the present specification, the outgoing illumination or FOVs of a plurality of illuminators of a multi illuminator endoscopic tip are maneuverable for dynamic redirection. According to another aspect of the present specification, a lens of at least one viewing element of the multi illuminator endoscope tip is also moveable or maneuverable to dynamically redirect incoming light beams from a plurality of FOVs to the viewing element.
Reference is now made to
Reference is now made to
According to an aspect of the present specification, a lens actuation system is used to dynamically redirect the illumination or FOV of at least one of the illuminators and/or of the lens of at least one viewing element, of a multi illuminator endoscopic tip. In various embodiments, the lens actuation system comprises an actuatable lens and piezoelectric actuators associated with an illuminator and/or with a lens of a viewing element. Referring back to
In some embodiments, the lens actuation system 500 is associated with a stiff lens of at least one illuminator, such as the lens 301B of the illuminator 301 of
In some embodiments, the lens actuation system 600 is associated with a flexible lens of at least one illuminator, such as the lens 301B of the illuminator 301 of
In some embodiments, the lens actuation system 700 is associated with an electro responsive coated lens of at least one illuminator, such as the lens 301B of the illuminator 301 of
Referring back to
In accordance with an embodiment, the FPGA calculates the total average brightness of an image frame acquired by a viewing element. In another embodiment, the FPGA calculates, around every luminance pixel sample arriving from the viewing element, a Gaussian (an averaging process using Gaussian weighting) of its neighborhood pixels. Hence, the Gaussian is the local brightness around the luminance pixel sample. In an alternate embodiment, however, the image processing to determine the total average or Gaussian local brightness is performed by a software program or by hardware processors, such as an ASIC processor or a micro-controller, processing the data received from a viewing element. A local blooming control module that calculates and uses a Gaussian local brightness to control blooming is described in U.S. Provisional Patent Application No. 62/093,871, entitled “System and Method for Processing Video Images Generated By A Multiple Viewing Elements Endoscope” and filed on Dec. 18, 2014, which is herein incorporated by reference in its entirety.
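The Gaussian local brightness described above can be sketched in software as a Gaussian-weighted average over each pixel's neighborhood (Python/NumPy; the kernel radius, sigma and function names are assumptions for illustration, not details of the cited application):

```python
import numpy as np

def gaussian_kernel_1d(sigma: float, radius: int) -> np.ndarray:
    """Normalized 1-D Gaussian weights over [-radius, radius]."""
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-(x ** 2) / (2 * sigma ** 2))
    return k / k.sum()

def gaussian_local_brightness(luma: np.ndarray, sigma: float = 1.0) -> np.ndarray:
    """Gaussian-weighted average brightness of each pixel's neighborhood.

    Uses a separable convolution (rows, then columns); the frame is padded
    by reflection so border pixels still average over a full-sized kernel.
    """
    radius = max(1, int(3 * sigma))
    k = gaussian_kernel_1d(sigma, radius)
    padded = np.pad(luma.astype(float), radius, mode="reflect")
    rows = np.apply_along_axis(lambda r: np.convolve(r, k, mode="valid"), 1, padded)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="valid"), 0, rows)
```

Because the kernel is separable, the same computation maps naturally onto a streaming FPGA pipeline, which is one reason a Gaussian (rather than an arbitrary 2-D filter) is a convenient choice for local brightness.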
The total average brightness or the Gaussian local brightness of an image or video frame is compared with an upper threshold brightness level and a lower threshold brightness level. A value above the upper threshold brightness level is indicative of over-exposed, over-lit, saturated or blooming regions within an image, while a value below the lower threshold brightness level is indicative of under-exposed, under-lit, too dark or dim regions. In one embodiment, the upper and lower threshold brightness levels are pre-set by default (based on empirically determined optimal brightness preferences of a representative universal set of physicians). The range of acceptable brightness, as defined by the upper and lower thresholds, is further customizable by the physician depending upon his/her visual preference.
In one embodiment, areas or regions of an image or video frame having a Gaussian local brightness higher than the upper threshold brightness, intensity or luminance level are identified or segmented as being too bright, saturated or over exposed. Similarly, areas or regions of the image or video frame having, for example, Gaussian local brightness lower than the lower threshold brightness, intensity or luminance level are identified or segmented as being too dim or under exposed. Depending upon the viewing element that acquired the image or video frame, the main control unit automatically maneuvers the plurality of actuating devices associated with the plurality of illuminators that are used to illuminate the FOV of the viewing element. In one illustrative yet non-limiting example, as shown in
Similarly, if the identified or segmented region is under-exposed, such as, for example, in the corners of the image, then the actuatable lens of at least one illuminator (that illuminates the FOV of the viewing element responsible for capturing the image) is maneuvered to move the FOV of the at least one illuminator towards the under-exposed region until the Gaussian local brightness level of the under-exposed region falls within the range defined by the upper and lower threshold brightness levels. The actuatable lenses of one or more concerned illuminators are manipulated to ensure that the brightness or intensity levels of none of the regions or segments of the image or video frame fall outside the acceptable range defined by the upper and lower threshold brightness levels.
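The segmentation and redirection logic above might be sketched as follows (Python/NumPy; the threshold values, function names and the simple centroid-based steering rule are all illustrative assumptions, not the control law of this specification):

```python
import numpy as np

UPPER = 200.0  # illustrative upper threshold brightness level
LOWER = 40.0   # illustrative lower threshold brightness level

def segment_exposure(local_brightness: np.ndarray,
                     upper: float = UPPER, lower: float = LOWER):
    """Split a frame of Gaussian local brightness into over-/under-exposed masks."""
    over = local_brightness > upper
    under = local_brightness < lower
    return over, under

def redirection_command(over: np.ndarray, under: np.ndarray) -> np.ndarray:
    """Derive a coarse (dy, dx) steering hint for an actuatable lens.

    Steers away from the centroid of over-exposed pixels and toward the
    centroid of under-exposed pixels; returns (0, 0) when exposure is fine.
    """
    h, w = over.shape
    centre = np.array([(h - 1) / 2.0, (w - 1) / 2.0])
    cmd = np.zeros(2)
    if over.any():
        ys, xs = np.nonzero(over)
        cmd -= np.array([ys.mean(), xs.mean()]) - centre  # away from glare
    if under.any():
        ys, xs = np.nonzero(under)
        cmd += np.array([ys.mean(), xs.mean()]) - centre  # toward the dark area
    return cmd
```

In a real system the command would be translated into drive signals for the piezoelectric or other actuating devices, with magnitude and duration depending on the lens type, as discussed below.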
It should be appreciated that the duration and extent of electrical stimulation of the plurality of actuating devices or sensors and the resultant direction and type of movement of the actuatable lens of the illuminator(s) is determined based on at least a) the identified region or segment, b) the brightness level (such as the Gaussian local brightness) of the identified region or segment and c) the type of actuatable lens (stiff, flexible or coated) of the concerned illuminator(s) that need to be manipulated. Also, as discussed earlier in the specification with reference to
In another embodiment, the actuatable lenses of the illuminators are maneuvered manually by the physician using a plurality of buttons or switches available on the handle of the endoscope (such as the buttons 205 on the handle 204 of the endoscope 202 of
At step 835, the Gaussian local brightness values or levels of the acquired image or video frame are compared with an upper threshold brightness level and a lower threshold brightness level. This comparison is used to identify or segment those areas or regions of the acquired image or video frame having Gaussian local brightness levels above the upper threshold brightness level and/or below the lower threshold brightness level. Thus, at step 835, if the Gaussian local brightness level of a region or area is determined to be above the upper threshold brightness level then the region is identified or segmented as being too bright, saturated or over exposed. As a result of this identification, at step 845, the main control unit causes at least one of the plurality of transducers to maneuver the at least one actuatable lens so as to redirect the illumination FOV of the at least one illuminator with reference to the other illuminators till the Gaussian local brightness level of the identified over exposed region falls within the range defined by the upper and lower threshold brightness levels.
At step 855, if the Gaussian local brightness level of a region or area is determined to be below the lower threshold brightness level then the region is identified or segmented as being too dim or under exposed. As a result of this identification, at step 865, the main control unit causes at least one of the plurality of transducers to maneuver the at least one actuatable lens so as to redirect the illumination FOV of the at least one illuminator with reference to the other illuminators till the Gaussian local brightness level of the identified under exposed region falls within the range defined by the upper and lower threshold brightness levels.
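Steps 835 through 865 amount to a per-frame feedback loop, which might be sketched as follows (Python/NumPy; `move_lens` is a hypothetical placeholder for the main control unit driving the transducers, not an interface defined in this specification):

```python
import numpy as np

def exposure_control_step(local_brightness: np.ndarray,
                          upper: float, lower: float, move_lens) -> bool:
    """One iteration of the brightness feedback loop (steps 835-865, sketched).

    `move_lens(region, direction)` stands in for the main control unit
    commanding the lens transducers. Returns True when every region of the
    frame already lies within the acceptable brightness range.
    """
    over = local_brightness > upper    # step 835: too bright / over-exposed
    under = local_brightness < lower   # step 855: too dim / under-exposed
    if over.any():
        move_lens("over_exposed", "away")     # step 845: redirect FOV away
    if under.any():
        move_lens("under_exposed", "toward")  # step 865: redirect FOV toward
    return not over.any() and not under.any()
```

Run once per acquired frame, the loop keeps actuating until the function reports that the Gaussian local brightness everywhere falls between the two thresholds.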
The above examples are merely illustrative of the many applications of the system of present specification. Although only a few embodiments of the present invention have been described herein, it should be understood that the present invention might be embodied in many other specific forms without departing from the spirit or scope of the invention. Therefore, the present examples and embodiments are to be considered as illustrative and not restrictive, and the invention may be modified within the scope of the appended claims.
The present application relies on, for priority, U.S. Provisional Patent Application No. 61/989,895, entitled “Multi-Illuminator Endoscopic Lens Actuation Systems” and filed on May 7, 2014, which is herein incorporated by reference in its entirety. The present application relates to U.S. patent application Ser. No. 14/603,137, entitled “Image Capture and Video Processing Systems and Methods for Multiple Viewing Element Endoscopes”, filed on Jan. 22, 2015, which relies on U.S. Provisional Patent Application No. 61/930,101, entitled “Daisy Chain Multi-Sensor Endoscopic System” and filed on Jan. 22, 2014 and U.S. Provisional Patent Application No. 61/948,012, entitled “Parallel Illuminating Systems” and filed on Mar. 4, 2014. The present application also relates to U.S. patent application Ser. No. 13/655,120, entitled “Multi-Viewing Element Endoscope”, and filed on Oct. 18, 2012. In addition, the present application also relates to U.S. patent application Ser. No. 13/882,004, entitled “Optical System for Multi-Sensor Endoscopes”, filed on Apr. 26, 2013, which is a 371 National Stage Entry of PCT Application Number PCT/IL11/000832, of the same title, and filed on Oct. 27, 2011, which, in turn, relies upon U.S. Provisional Patent Application No. 61/407,495, filed on Oct. 28, 2010. The present application also relates to U.S. patent application Ser. No. 13/992,014, entitled “Flexible Electronic Circuit Board for a Multi-Camera Endoscope”, filed on Jun. 6, 2013, which is a 371 National Stage Entry of PCT Application Number PCT/IL11/050049, of the same title, and filed on Dec. 8, 2011, which, in turn, relies upon U.S. Provisional Patent Application No. 61/421,238, filed on Dec. 9, 2010. All of the above-mentioned applications are herein incorporated by reference in their entirety.