Device and method for reducing effects of video artifacts

Information

  • Patent Grant
  • 8310530
  • Patent Number
    8,310,530
  • Date Filed
    Monday, May 21, 2007
  • Date Issued
    Tuesday, November 13, 2012
Abstract
A method for reducing an effect of a video artifact includes adjusting a phase of a second imaging device's video clock signal so that a phase of the second imaging device's video synchronization signal matches a phase of a first imaging device's video synchronization signal. An endoscopic system includes a first imaging device, a second imaging device, a light source, and a controller that reduces an artifact in an image produced by the first imaging device. In some embodiments, the first imaging device faces the light source.
Description
TECHNICAL FIELD OF THE INVENTION

The present invention relates to a device and method for reducing effects of video artifacts.


BACKGROUND OF THE INVENTION

Multiple endoscopic devices with multiple cameras and light sources may be used for medical procedures, inspection of small pipes, or remote monitoring. For example, such an endoscopic device may be a medical endoscope comprising a flexible tube, and a camera and a light source mounted on the distal end of the flexible tube. The endoscope is insertable into an internal body cavity through a body orifice to examine the body cavity and tissues for diagnosis. The tube of the endoscope has one or more longitudinal channels, through which an instrument can reach the body cavity to take samples of suspicious tissues or to perform other surgical procedures such as polypectomy.


There are many types of endoscopes, and they are named in relation to the organs or areas with which they are used. For example, gastroscopes are used for examination and treatment of the esophagus, stomach and duodenum; colonoscopes for the colon; bronchoscopes for the bronchi; laparoscopes for the peritoneal cavity; sigmoidoscopes for the rectum and the sigmoid colon; arthroscopes for joints; cystoscopes for the urinary bladder; and angioscopes for the examination of blood vessels.


Each endoscope has a single forward-viewing camera mounted at the distal end of the flexible tube to transmit an image to an eyepiece or video camera at the proximal end. The camera is used to assist a medical professional in advancing the endoscope into a body cavity and looking for abnormalities. The camera provides the medical professional with a two-dimensional view from the distal end of the endoscope. To capture an image from a different angle or of a different portion of the body cavity, the endoscope must be repositioned or moved back and forth. Repositioning and movement of the endoscope prolong the procedure and cause added discomfort, complications, and risks to the patient. Additionally, in an environment similar to the lower gastro-intestinal tract, flexures, tissue folds and unusual geometries of the organ may prevent the endoscope's camera from viewing all areas of the organ. The unseen area may cause a potentially malignant (cancerous) polyp to be missed.


This problem can be overcome by providing an auxiliary camera and an auxiliary light source. The auxiliary camera and light source can be oriented to face the main camera and light source, thus providing an image of areas not viewable by the endoscope's main camera. This arrangement of cameras and light sources can provide both front and rear views of an area or an abnormality. In the case of polypectomy where a polyp is excised by placing a wire loop around the base of the polyp, the camera arrangement allows better placement of the wire loop to minimize damage to the adjacent healthy tissue.


The two cameras may be based on different technologies and may have different characteristics. In many cases, the main camera is a charge coupled device (CCD) camera that requires a very intense light source for illumination. Such a light source may be a fiber optic bundle. The auxiliary camera may be a complementary metal oxide semiconductor (CMOS) camera with a light emitting diode (LED) to provide illumination.


SUMMARY OF THE INVENTION

The inventors of the present application have observed that, when multiple imaging and light emitting devices are used as described in the background section of the present specification, artifacts may appear on the video images produced by the imaging devices. One example of the observed artifacts is a “thin line” near the top edge of a video image generated by a main endoscope's CCD imaging device, when the CCD imaging device is used in pair with retrograde imaging and light emitting devices.


The inventors believe that this “thin line” artifact is caused by how images are captured and/or processed by the CCD imaging device. Alternatively or additionally, the artifact may be related to the video processing circuitry. In a CCD camera system, image data are captured and/or processed one row of the image at a time. As a result, if there is a bright light source such as the retrograde light emitting device, individual pixels of the image can succumb to charge “leaks” that spill over into the other light receptors in the same row. This may cause loss of portions of the video image and the appearance of a “thin line” in the video image.
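As a rough illustration of this row-wise charge-spill mechanism, the following toy model (not the patent's circuitry; the saturation threshold and the forward spill direction are assumptions for illustration) shows how one overexposed pixel can wash out the remainder of its row during serial readout:

```python
# Toy model of CCD row-wise readout with charge spill ("blooming").
# FULL_WELL and the forward-only spill are illustrative assumptions;
# real CCDs have anti-blooming structures and more complex behavior.

FULL_WELL = 100  # assumed saturation charge per pixel (arbitrary units)

def read_row(row):
    """Read one row; charge above FULL_WELL spills into the next pixel."""
    out = list(row)
    for i, charge in enumerate(out):
        excess = charge - FULL_WELL
        if excess > 0:
            out[i] = FULL_WELL
            # Excess charge leaks into the next receptor in the same row,
            # possibly saturating it as well and cascading down the row.
            if i + 1 < len(out):
                out[i + 1] += excess
    return out

# A row containing one very bright pixel (e.g. a facing light source):
row = [10, 10, 500, 10, 10]
print(read_row(row))  # [10, 10, 100, 100, 100] -- a bright "line"
```

One saturated pixel thus pins the rest of the row at full scale, which is consistent with the observed artifact being a thin horizontal line rather than a bright spot.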


The present invention can be used to reduce the effects of the “thin line” artifact. In accordance with one aspect of the invention, an endoscopic system includes a first imaging device, a second imaging device, a light source, and a controller that reduces an artifact in an image produced by the first imaging device. In some embodiments, the first imaging device faces the light source.


In one preferred embodiment, the controller adjusts a frequency of the second imaging device's video clock signal so that a frequency of the second imaging device's video synchronization signal matches a frequency of the first imaging device's video synchronization signal.


In another preferred embodiment, the controller adjusts a phase of the second imaging device's video clock signal to vary the phase between the second imaging device's video synchronization signal and the first imaging device's video synchronization signal. The phase between the two video synchronization signals may be zero or nonzero.


In still another preferred embodiment, the controller synchronizes a duty cycle of the light source to turn on the light source only when the first imaging device is in a vertical blanking interval to reduce the size of the artifact.


In yet another preferred embodiment, the controller moves the artifact by adjusting a pulse width and/or delay timing of the light source. Preferably, the controller moves the artifact vertically.


In still yet another preferred embodiment, the controller includes a phase lock loop circuit that is connected to the first imaging device to receive a video synchronization signal of the first imaging device and connected to the second imaging device to receive a video synchronization signal of the second imaging device and to send a video clock signal for the second imaging device. The phase lock loop circuit adjusts a phase of the second imaging device's video clock signal so that a phase of the second imaging device's video synchronization signal matches a phase of the first imaging device's video synchronization signal.


In a further preferred embodiment, the phase lock loop circuit adjusts a frequency of the second imaging device's video clock signal so that a frequency of the second imaging device's video synchronization signal matches a frequency of the first imaging device's video synchronization signal.


In a still further preferred embodiment, the controller includes a light source driver, and the light source driver is connected to the phase lock loop circuit to receive the video clock signal. Preferably, the light source driver synchronizes a duty cycle of the light source to turn on the light source only when the first imaging device is in a vertical blanking interval to reduce the size of the artifact.


In a yet further preferred embodiment, the light source driver moves the artifact by adjusting a pulse width and/or delay timing of the light source. Preferably, the controller moves the artifact vertically.


In a still yet further preferred embodiment, the phase lock loop circuit includes a sync separator that is connected to the first imaging device to receive the video synchronization signal of the first imaging device and connected to the second imaging device to receive the video synchronization signal of the second imaging device. Preferably, the sync separator extracts a vertical synchronization signal from the video synchronization signal of the first imaging device and another vertical synchronization signal from the video synchronization signal of the second imaging device.


In another preferred embodiment, the phase lock loop circuit includes a phase detector that is connected to the sync separator to receive the vertical synchronization signals. Preferably, the phase detector computes the phase difference between the vertical synchronization signals using the vertical synchronization signal of the first imaging device as a reference signal.


In still another preferred embodiment, the phase lock loop circuit includes a low pass filter that is connected to the phase detector to receive the phase difference and that averages the phase difference to reduce the noise content of the phase difference.


In yet another preferred embodiment, the phase lock loop circuit includes an oscillator that is connected to the low pass filter to receive the averaged phase difference and that creates the video clock signal.


In accordance with another aspect of the invention, an endoscopic system includes a first imaging device, a second imaging device, a light source, and a controller that varies a phase difference between video synchronization signals of the first and second imaging devices to reduce an artifact in an image produced by the first imaging device.


In accordance with still another aspect of the invention, an endoscopic system includes a first imaging device, a first light source, a second imaging device, a second light source, and a controller. The first imaging device and light source may face the second imaging device and light source. Preferably, the controller includes a phase lock loop circuit that is connected to the first imaging device to receive a video synchronization signal of the first imaging device and connected to the second imaging device to receive a video synchronization signal of the second imaging device and to send a video clock signal for the second imaging device so that image frames of the imaging devices have the same frequency and are in phase. The first imaging device and light source may be powered on during one half of the frame period, and the second imaging device and light source are powered on during the other half of the frame period.


In a preferred embodiment, the frame frequency is sufficiently high that the human eye cannot perceive that the first and second imaging devices and their light sources are intermittently powered on and off.


In accordance with a further aspect of the invention, a method for reducing an effect of a video artifact includes adjusting a phase of a second imaging device's video clock signal so that a phase of the second imaging device's video synchronization signal matches a phase of a first imaging device's video synchronization signal.


In a preferred embodiment, the method further includes adjusting a frequency of the second imaging device's video clock signal so that a frequency of the second imaging device's video synchronization signal matches a frequency of the first imaging device's video synchronization signal.


In another preferred embodiment, the method further includes synchronizing a duty cycle of a light source to turn on the light source only when the first imaging device is in a vertical blanking interval to reduce the size of the artifact, wherein the light source faces the first imaging device.


In yet another preferred embodiment, the method further includes moving the artifact by adjusting a pulse width and/or delay timing of the light source. Preferably, the moving step includes moving the artifact vertically.


In still yet another preferred embodiment, the adjusting step includes extracting a vertical synchronization signal from the video synchronization signal of the first imaging device and another vertical synchronization signal from the video synchronization signal of the second imaging device.


In a further preferred embodiment, the adjusting step includes computing the phase difference between the vertical synchronization signals using the vertical synchronization signal of the first imaging device as a reference signal.


In a still further preferred embodiment, the adjusting step includes averaging the phase difference to reduce the noise content of the phase difference.


In a yet further preferred embodiment, the adjusting step includes creating the video clock signal.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 shows a perspective view of an endoscope with an imaging assembly according to one embodiment of the present invention.



FIG. 2 shows a perspective view of the distal end of an insertion tube of the endoscope of FIG. 1.



FIG. 3 shows a perspective view of the imaging assembly shown in FIG. 1.



FIG. 4 shows a perspective view of the distal ends of the endoscope and imaging assembly of FIG. 1.



FIG. 5 shows a schematic diagram of a controller that, together with the endoscope of FIG. 1, forms an endoscope system.





DETAILED DESCRIPTION OF EMBODIMENTS OF THE INVENTION


FIG. 1 illustrates an exemplary endoscope 10 of the present invention. This endoscope 10 can be used in a variety of medical procedures in which imaging of a body tissue, organ, cavity or lumen is required. The types of procedures include, for example, anoscopy, arthroscopy, bronchoscopy, colonoscopy, cystoscopy, EGD, laparoscopy, and sigmoidoscopy.


The endoscope 10 of FIG. 1 includes an insertion tube 12 and an imaging assembly 14, a section of which is housed inside the insertion tube 12. As shown in FIG. 2, the insertion tube 12 has two longitudinal channels 16. In general, however, the insertion tube 12 may have any number of longitudinal channels. An instrument can reach the body cavity through one of the channels 16 to perform any desired procedure, such as taking samples of suspicious tissues or performing other surgical procedures such as polypectomy. Examples of such instruments include retractable needles for drug injection, hydraulically actuated scissors, clamps, grasping tools, electrocoagulation systems, ultrasound transducers, electrical sensors, heating elements, laser mechanisms and other ablation means. In some embodiments, one of the channels can be used to supply a washing liquid such as water for washing. Another or the same channel may be used to supply a gas, such as CO2 or air, into the organ. The channels 16 may also be used to extract fluids or inject fluids, such as a drug in a liquid carrier, into the body. Various biopsy, drug delivery, and other diagnostic and therapeutic devices may also be inserted via the channels 16 to perform specific functions.


The insertion tube 12 preferably is steerable or has a steerable distal end region 18 as shown in FIG. 1. The length of the distal end region 18 may be any suitable fraction of the length of the insertion tube 12, such as one half, one third, one fourth, one sixth, one tenth, or one twentieth. The insertion tube 12 may have control cables (not shown) for the manipulation of the insertion tube 12. Preferably, the control cables are symmetrically positioned within the insertion tube 12 and extend along the length of the insertion tube 12. The control cables may be anchored at or near the distal end 36 of the insertion tube 12. Each of the control cables may be a Bowden cable, which includes a wire contained in a flexible overlying hollow tube. The wires of the Bowden cables are attached to controls 20 in the handle 22. Using the controls 20, the wires can be pulled to bend the distal end region 18 of the insertion tube 12 in a given direction. The Bowden cables can be used to articulate the distal end region 18 of the insertion tube 12 in different directions.


As shown in FIG. 1, the endoscope 10 may also include a control handle 22 connected to the proximal end 24 of the insertion tube 12. Preferably, the control handle 22 has one or more ports and/or valves (not shown) for controlling access to the channels 16 of the insertion tube 12. The ports and/or valves can be air or water valves, suction valves, instrumentation ports, and suction/instrumentation ports. As shown in FIG. 1, the control handle 22 may additionally include buttons 26 for taking pictures with an imaging device on the insertion tube 12, the imaging assembly 14, or both. The proximal end 28 of the control handle 22 may include an accessory outlet 30 (FIG. 1) that provides fluid communication between the air, water and suction channels and the pumps and related accessories. The same outlet 30 or a different outlet can be used for electrical lines to light and imaging components at the distal end of the endoscope 10.


As shown in FIG. 2, the endoscope 10 may further include an imaging device 32 and light sources 34, both of which are disposed at the distal end 36 of the insertion tube 12. The imaging device 32 may include, for example, a lens, single chip sensor, multiple chip sensor or fiber optic implemented devices. The imaging device 32, in electrical communication with a processor and/or monitor, may provide still images or recorded or live video images. The light sources 34 preferably are equidistant from the imaging device 32 to provide even illumination. The intensity of each light source 34 can be adjusted to achieve optimum imaging. The circuits for the imaging device 32 and light sources 34 may be incorporated into a printed circuit board (PCB).


As shown in FIGS. 3 and 4, the imaging assembly 14 may include a tubular body 38, a handle 42 connected to the proximal end 40 of the tubular body 38, an auxiliary imaging device 44, a link 46 that provides physical and/or electrical connection between the auxiliary imaging device 44 to the distal end 48 of the tubular body 38, and an auxiliary light source 50 (FIG. 4). The auxiliary light source 50 may be an LED device.


As shown in FIG. 4, the imaging assembly 14 of the endoscope 10 is used to provide an auxiliary imaging device at the distal end of the insertion tube 12. To this end, the imaging assembly 14 is placed inside one of the channels 16 of the endoscope's insertion tube 12 with its auxiliary imaging device 44 disposed beyond the distal end 36 of the insertion tube 12. This can be accomplished by first inserting the distal end of the imaging assembly 14 into the insertion tube's channel 16 from the endoscope's handle 22 and then pushing the imaging assembly 14 further into the channel 16 until the auxiliary imaging device 44 and link 46 of the imaging assembly 14 are positioned outside the distal end 36 of the insertion tube 12 as shown in FIG. 4.


Each of the main and auxiliary imaging devices 32, 44 may be an electronic device which converts light incident on photosensitive semiconductor elements into electrical signals. The imaging sensor may detect either color or black-and-white images. The signals from the imaging sensor can be digitized and used to reproduce an image that is incident on the imaging sensor. Two commonly used types of image sensors are Charge Coupled Devices (CCD), such as the VCC-5774 produced by Sanyo of Osaka, Japan, and Complementary Metal Oxide Semiconductor (CMOS) camera chips, such as the OVT 6910 produced by OmniVision of Sunnyvale, Calif. Preferably, the main imaging device 32 is a CCD imaging device, and the auxiliary imaging device 44 is a CMOS imaging device.


When the imaging assembly 14 is properly installed in the insertion tube 12, the auxiliary imaging device 44 of the imaging assembly 14 preferably faces backwards towards the main imaging device 32 as illustrated in FIG. 4. The auxiliary imaging device 44 may be oriented so that the auxiliary imaging device 44 and the main imaging device 32 have adjacent or overlapping viewing areas. Alternatively, the auxiliary imaging device 44 may be oriented so that the auxiliary imaging device 44 and the main imaging device 32 simultaneously provide different views of the same area. Preferably, the auxiliary imaging device 44 provides a retrograde view of the area, while the main imaging device 32 provides a front view of the area. However, the auxiliary imaging device 44 could be oriented in other directions to provide other views, including views that are substantially parallel to the axis of the main imaging device 32.


As shown in FIG. 4, the link 46 connects the auxiliary imaging device 44 to the distal end 48 of the tubular body 38. Preferably, the link 46 is a flexible link that is at least partially made from a flexible shape memory material that substantially tends to return to its original shape after deformation. Shape memory materials are well known and include shape memory alloys and shape memory polymers. A suitable flexible shape memory material is a shape memory alloy such as nitinol. The flexible link 46 is straightened to allow the distal end of the imaging assembly 14 to be inserted into the proximal end of a channel 16 of the insertion tube 12 and then pushed towards the distal end 36 of the insertion tube 12. When the auxiliary imaging device 44 and flexible link 46 are pushed sufficiently out of the distal end 36 of the insertion tube 12, the flexible link 46 resumes its natural bent configuration as shown in FIG. 3. The natural configuration of the flexible link 46 is the configuration of the flexible link 46 when the flexible link 46 is not subject to any force or stress. When the flexible link 46 resumes its natural bent configuration, the auxiliary imaging device 44 faces substantially back towards the distal end 36 of the insertion tube 12 as shown in FIG. 4.


In the illustrated embodiment, the auxiliary light source 50 of the imaging assembly 14 is placed on the flexible link 46, in particular on the curved concave portion of the flexible link 46. The auxiliary light source 50 provides illumination for the auxiliary imaging device 44 and may face substantially the same direction as the auxiliary imaging device 44 as shown in FIG. 4.


The endoscope of the present invention, such as the endoscope 10 shown in FIG. 1, may be part of an endoscope system that also includes a controller. The term “controller” as used in this specification is broadly defined. In some embodiments, for example, the term “controller” may simply be a signal processing unit.


The controller can be used for, among others, reducing or eliminating the “thin line” artifacts described above. FIG. 5 illustrates a preferred embodiment 52 of the controller. The preferred controller 52 includes a phase lock loop (PLL) circuit 54. The PLL circuit 54 includes a sync separator 56, a phase detector 58, a low pass filter 60, and an oscillator 62.


The sync separator 56 is connected to each of the main and auxiliary imaging devices 32, 44 to receive a video synchronization signal 64 from each imaging device 32, 44. The sync separator 56 extracts a vertical synchronization signal from each video synchronization signal 64. The phase detector 58 is connected to the sync separator 56 and receives the vertical synchronization signals from the sync separator 56. The phase detector 58 then computes the phase difference between the vertical synchronization signals using the vertical synchronization signal of the main imaging device 32 as the reference signal. The low pass filter 60 is connected to the phase detector 58 and receives the phase difference from the phase detector 58. The low pass filter 60 averages the phase difference to reduce the noise content of the phase difference. The oscillator 62 is connected to the low pass filter 60 and receives the averaged phase difference. Based on the averaged phase difference, the oscillator 62 creates an output signal that matches the frequency and phase of the vertical synchronization signal of the main imaging device 32. This output signal of the PLL circuit 54 may then be amplified and sent to the auxiliary imaging device 44 as a video clock signal 66. This feedback control loop adjusts the phase and/or frequency of the auxiliary imaging device's video clock so that the phase and frequency of the auxiliary imaging device's video synchronization signal 64 match those of the main imaging device 32's video synchronization signal 64. In other words, the two imaging devices 32, 44 have the same frame frequency and frame phase.
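The feedback loop just described can be sketched in software. The following minimal analogue (illustrative only; the described embodiment is a hardware PLL, and the gain and filter constants here are assumptions) shows the phase detector, low pass filter, and clock-steering stages acting on successive vertical sync measurements:

```python
# Minimal software analogue of the PLL loop: a phase detector, a
# low-pass (averaging) filter, and a clock whose phase is steered
# toward the reference vsync. Gains and rates are assumptions.

class SoftwarePLL:
    def __init__(self, alpha=0.1, gain=0.5):
        self.alpha = alpha      # low-pass filter smoothing factor
        self.gain = gain        # loop gain applied to the filtered error
        self.avg_error = 0.0    # filtered phase difference (seconds)
        self.phase = 0.0        # auxiliary clock phase offset (seconds)

    def update(self, ref_vsync_time, aux_vsync_time):
        """Process one pair of vertical sync edges."""
        error = ref_vsync_time - aux_vsync_time                   # phase detector
        self.avg_error += self.alpha * (error - self.avg_error)   # low pass filter
        self.phase += self.gain * self.avg_error                  # steer the clock
        return self.avg_error

pll = SoftwarePLL()
# Feed repeated vsync pairs with a constant 2 ms offset; the filtered
# error converges toward 0.002 s, which the loop uses to pull the
# auxiliary clock into phase with the main device's sync.
for _ in range(100):
    err = pll.update(ref_vsync_time=0.002, aux_vsync_time=0.0)
print(round(err, 4))  # 0.002
```

The averaging stage plays the role of the low pass filter 60: it keeps a single noisy vsync measurement from yanking the clock, while a persistent offset still accumulates into a correction.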


The preferred controller 52 shown in FIG. 5 may also include an auxiliary light source driver 68 that is used to “pulse” the auxiliary light source 50. A “pulsed” light source is not constantly powered on. Instead, it is turned on and off intermittently at a certain frequency. The frequency, phase and duty cycle (pulse width) of the auxiliary light source 50 can be adjusted by the auxiliary light source driver 68. In addition, the output signal of the PLL circuit 54 may also be sent to the auxiliary light source driver 68 to match the frequency of the auxiliary light source 50 with that of the imaging devices 32, 44.


The inventors of the present application have discovered that the size and position of the artifact on an image produced by the main imaging device 32 may be adjusted by varying at least the duty cycle of the auxiliary light source 50 and by varying at least the phase of the auxiliary light source 50 relative to the imaging devices 32, 44. For example, the duty cycle of the auxiliary light source 50 may be adjusted to vary at least the size of the artifact. In particular, the size of the artifact may be reduced by decreasing the duty cycle of the auxiliary light source 50. In the case of the “thin line” artifact, the length of the artifact may be reduced by decreasing the duty cycle of the auxiliary light source 50.
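To make the duty-cycle relationship concrete, a back-of-envelope estimate (using assumed NTSC-like timing figures, not values from the patent) counts the video lines read out while the opposing light is on; those are the lines the artifact can occupy:

```python
# Rough estimate of how many video lines an opposing pulsed light
# source can affect. The NTSC-like timing numbers are assumptions
# for illustration only.

LINE_PERIOD_US = 63.5        # one scan line, approximately (NTSC)
FIELD_PERIOD_US = 16_683.0   # one 60 Hz field, approximately

def affected_lines(duty_cycle):
    """Lines read out while the light is on during one field."""
    on_time_us = duty_cycle * FIELD_PERIOD_US
    return round(on_time_us / LINE_PERIOD_US)

# Halving the duty cycle roughly halves the artifact's vertical extent:
print(affected_lines(0.10))  # 26
print(affected_lines(0.05))  # 13
```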


For another example, the artifact on an image produced by the main imaging device 32 may be moved, such as vertically, by varying at least the phase of the auxiliary light source 50 relative to the imaging devices 32, 44. The controller 52 may allow a user to adjust the phase of the auxiliary light source 50 to move the artifact to a region of non-interest in the image such as the location of the auxiliary light source 50.


For a further example, the duty cycle and/or phase of the auxiliary light source 50 may be adjusted so that the auxiliary light source 50 is powered on only when the main imaging device 32 is in a vertical blanking interval, resulting in a reduction in the size of the artifact.
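A sketch of that blanking-interval condition follows; the pulse is acceptable when it falls entirely within the vertical blanking interval, so no active video lines are exposed to the opposing light. The timing values are assumed NTSC-like figures, not values from the patent:

```python
# Check whether a light pulse fits inside the vertical blanking
# interval. Line period and blanking-line count are illustrative
# NTSC-like assumptions.

LINE_PERIOD_US = 63.5
VBLANK_LINES = 21                       # assumed blanking lines per field
VBLANK_US = VBLANK_LINES * LINE_PERIOD_US   # ~1333.5 us of blanking

def fits_in_vblank(pulse_start_us, pulse_width_us):
    """True if the pulse lies within the blanking window [0, VBLANK_US]."""
    return pulse_start_us >= 0 and pulse_start_us + pulse_width_us <= VBLANK_US

print(fits_in_vblank(100.0, 800.0))   # True: pulse stays inside blanking
print(fits_in_vblank(100.0, 2000.0))  # False: pulse spills into active video
```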


Similarly, using the above-described processes and devices, the size and/or position of an artifact on an image produced by the auxiliary imaging device 44 may be adjusted by varying at least the duty cycle of the main light source 34 and by varying at least the phase of the main light source 34 relative to the imaging devices 32, 44.


In addition, an artifact on an image produced by one of the imaging devices 32, 44 may also be minimized by introducing a phase difference between the video synchronization signals of the main and auxiliary imaging devices 32, 44 (i.e., introducing a phase delay between the frames of the two video signals). The PLL circuit 54 may be used to maintain the desired phase difference between the video synchronization signals. The controller 52 may be used to adjust the phase difference between the video synchronization signals to minimize the artifact.


The auxiliary imaging device 44 and its light source 50 may be connected to the controller 52 (not shown) via electrical conductors that extend from the imaging device 44 and light source 50; through the link 46, tubular body 38, and handle 42; to the controller 52. The electrical conductors may carry power and control commands to the auxiliary imaging device 44 and its light source 50 and image signals from the auxiliary imaging device 44 to the controller 52.


The controller 52 may be used to adjust the parameters of the imaging devices 32, 44 and their light sources 34, 50, such as brightness, exposure time and mode settings. The adjustment can be done by writing digital commands to specific registers controlling the parameters. The registers can be addressed by their unique addresses, and digital commands can be read from and written to the registers to change the various parameters. The controller 52 can change the register values by transmitting data commands to the registers.
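The register-write scheme above can be sketched as follows. The register addresses, widths, and the in-memory register file are hypothetical stand-ins; a real sensor's register map and bus protocol come from its datasheet:

```python
# Sketch of register-based parameter control. The addresses below
# (REG_BRIGHTNESS, REG_EXPOSURE) are hypothetical examples, not the
# register map of any particular sensor.

class SensorRegisters:
    """Tiny in-memory stand-in for an imaging sensor's register file."""
    def __init__(self):
        self.regs = {}

    def write(self, addr, value):
        self.regs[addr] = value & 0xFF   # registers assumed 8 bits wide

    def read(self, addr):
        return self.regs.get(addr, 0)

REG_BRIGHTNESS = 0x10   # hypothetical address
REG_EXPOSURE = 0x11     # hypothetical address

sensor = SensorRegisters()
sensor.write(REG_BRIGHTNESS, 0x80)   # set mid-range brightness
sensor.write(REG_EXPOSURE, 0x40)     # set a shorter exposure time
print(hex(sensor.read(REG_BRIGHTNESS)))  # 0x80
```

Each parameter lives at a unique address, so the controller changes behavior simply by transmitting an (address, value) command pair over the control lines.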


In an alternate embodiment, the controller 52 may be used to reduce light interference between the main imaging device 32 and light source 34 and the auxiliary imaging device 44 and light source 50. Since the main imaging device 32 and light source 34 face the auxiliary imaging device 44 and light source 50, the main light source 34 interferes with the auxiliary imaging device 44, and the auxiliary light source 50 interferes with the main imaging device 32. Light interference results when the light from a light source is projected directly onto an imaging device. This may cause glare, camera blooming, or oversaturation, resulting in inferior image quality.


To reduce or eliminate light interference, the imaging devices 32, 44 and their light sources 34, 50 may be turned on and off alternately. In other words, when the main imaging device 32 and light source 34 are turned on, the auxiliary imaging device 44 and light source 50 are turned off. And when the main imaging device 32 and light source 34 are turned off, the auxiliary imaging device 44 and light source 50 are turned on. Preferably, the imaging devices 32, 44 and their light sources 34, 50 are turned on and off at a sufficiently high frequency that the human eye does not perceive that the light sources 34, 50 are being turned on and off.


The timing of powering on and off the imaging devices 32, 44 and their light sources 34, 50 can be accomplished using the PLL circuit 54 shown in FIG. 5. The PLL circuit 54 may be employed to match the frame frequencies and phases of the imaging devices 32, 44 as discussed above. Then, the main imaging device 32 and light source 34 are powered on during one half of the frame period, and the auxiliary imaging device 44 and light source 50 are powered on during the other half of the frame period.
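The alternating half-frame scheme can be sketched as a simple time-division schedule. With the two frames phase locked by the PLL, the first half of each frame period belongs to the main device and the second half to the auxiliary device; the 60 Hz frame rate here is an assumed example:

```python
# Time-division schedule for the alternating half-frame scheme.
# The 60 Hz frame rate is an illustrative assumption.

FRAME_PERIOD_S = 1 / 60.0

def active_device(t_seconds):
    """Which imaging device (and its light source) is powered at time t."""
    t_in_frame = t_seconds % FRAME_PERIOD_S
    return "main" if t_in_frame < FRAME_PERIOD_S / 2 else "auxiliary"

print(active_device(0.001))  # main: first half of the frame period
print(active_device(0.012))  # auxiliary: second half of the frame period
```

Because exactly one device/light pair is on at any instant, neither imaging device ever sees the opposing light source directly, while the 60 Hz alternation remains imperceptible to the eye.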


The above-described processes and devices may also be used when there are more than two imaging devices and two light sources and when the imaging devices and light sources are on two or more endoscopes.

Claims
  • 1. A controller configured for use with an endoscopic system, the controller comprising: a phase lock loop circuit that is configured to receive a video synchronization signal from a first imaging sensor and to receive a video synchronization signal from a second imaging sensor to generate a video clock signal to send to the second imaging sensor, wherein the video clock signal is generated such that image frames of the imaging sensors have the same fundamental frequency and are phase locked, and wherein the first imaging sensor and a first light source face the second imaging sensor and a second light source; and a light source driver adapted to be connected to the second light source, wherein the light source driver is further connected to the phase lock loop circuit to receive the video clock signal, and wherein the first imaging sensor and the first light source are configured to be activated during a first time interval, and wherein the second imaging sensor and second light source are configured to be activated during a second time interval that is distinct from the first time interval.
  • 2. The controller of claim 1, wherein the controller is configured to lock a selected phase difference between the video clock signal and the first imaging sensor's video synchronization signal.
  • 3. The controller of claim 1, wherein the light source driver is configured to drive the second light source for only a portion of a full duty cycle.
  • 4. The controller of claim 1, wherein the light source driver is configured to adjust the location of an image artifact of the first imaging sensor by adjusting a phase difference between the video clock signal and the video synchronization signal of the first imaging sensor.
  • 5. The controller of claim 1, wherein the light source driver is configured to decrease the duty cycle of the second light source to reduce the size of an image artifact of the first imaging sensor.
  • 6. The controller of claim 1, wherein the light source driver is configured to activate the second light source only when the first imaging sensor is in a vertical blanking interval.
  • 7. An endoscopic system comprising: a first imaging sensor; a first light source; a second imaging sensor; a second light source, wherein the first imaging sensor and first light source face the second imaging sensor and second light source; and a controller including a phase lock loop circuit that is configured to receive a video synchronization signal from the first imaging sensor and to receive a video synchronization signal from the second imaging sensor to generate and adjust a video clock signal such that the second light source and the image frames of the first and second imaging sensors have the same fundamental frequency and are phase locked, wherein the first imaging sensor and first light source are activated during a first interval of the frame period, and wherein the second imaging sensor and second light source are activated during a second interval of the frame period that is distinct from the first interval.
  • 8. The endoscopic system of claim 7, wherein a phase difference between the second imaging sensor's video synchronization signal and the first imaging sensor's video synchronization signal is not zero.
  • 9. The endoscopic system of claim 7, wherein the controller is configured to adjust the location of an image artifact of the first image sensor.
  • 10. The endoscopic system of claim 7, wherein the controller is configured to reduce the size of an image artifact of the first imaging sensor by decreasing the duty cycle of the second light source.
  • 11. The endoscopic system of claim 7, wherein the second light source is activated only when the first imaging sensor is in a vertical blanking interval.
  • 12. The endoscopic system of claim 7, wherein the controller further comprises a light source driver adapted to be connected to the second light source, wherein the light source driver is connected to the phase lock loop circuit to receive the video clock signal and configured to activate the second light source relative to the video clock signal.
  • 13. The endoscopic system of claim 12, wherein the light source driver is configured to adjust the location of an image artifact of the first imaging sensor by adjusting a phase difference between the video clock signal and the video synchronization signal of the first imaging sensor.
  • 14. The endoscopic system of claim 13, wherein the light source driver is configured to move the artifact vertically.
  • 15. The endoscopic system of claim 12, wherein the light source driver is configured to reduce the size of an imaging artifact of the first imaging sensor by decreasing the duty cycle of the second light source.
  • 16. The endoscopic system of claim 12, wherein the light source driver is configured to activate the second light source only when the first imaging sensor is in a vertical blanking interval.
  • 17. The system of claim 7, wherein the frame frequency is sufficiently high such that eyes cannot sense that the first and second imaging sensors and their light sources are intermittently activated.
  • 18. The endoscopic system of claim 7, wherein the controller is configured to lock a selected phase difference between the video clock signal and the video synchronization signal of the first imaging sensor.
  • 19. The endoscopic system of claim 7, further comprising a light source driver, wherein the light source driver is connected to the phase lock loop circuit to receive the video clock signal.
  • 20. The endoscopic system of claim 19, wherein the light source driver is configured to adjust the location of an image artifact of the first imaging sensor by adjusting a phase difference between the video clock signal and the video synchronization signal of the first imaging sensor.
  • 21. The endoscopic system of claim 19, wherein the light source driver activates the second light source for only a portion of a full duty cycle.
  • 22. The endoscopic system of claim 7, wherein the first light source is configured to be activated only when the first imaging sensor is acquiring an image, and wherein the second light source is configured to be activated only when the second imaging sensor is acquiring an image.
  • 23. The endoscopic system of claim 7, wherein the controller is configured such that the first imaging sensor acquires an image and the first light source is activated during one half of the frame period, and wherein the second imaging sensor acquires an image and the second light source is activated during the other half of the frame period.
  • 24. The endoscopic system of claim 19, wherein the light source driver is configured to reduce a size of an image artifact of the first imaging sensor by decreasing the duty cycle of the second light source.
  • 25. A method for reducing light interference between a first imaging sensor and a second light source and a second imaging sensor and a first light source of an endoscopic system, wherein the first imaging sensor and the first light source face the second imaging sensor and the second light source, the method comprising: using a phase lock loop circuit to receive a video synchronization signal from the first imaging sensor and to receive a video synchronization signal from the second imaging sensor and to generate and adjust a video clock signal for the second imaging sensor so that image frames of the first and second imaging sensors have the same fundamental frequency and are phase locked; and activating the first imaging sensor and first light source during a first interval of the frame period, and activating the second imaging sensor and second light source during a second interval of the frame period that is distinct from the first interval.
  • 26. The method of claim 25, further comprising locking a selected phase difference between the video clock signal and the first imaging sensor's video synchronization signal.
  • 27. The method of claim 25, further comprising adjusting the location of an image artifact of the first imaging sensor by adjusting a phase difference between the video clock signal and the video synchronization signal of the first imaging sensor.
  • 28. The method of claim 27, wherein adjusting the phase difference moves the artifact vertically.
  • 29. The method of claim 25, further comprising reducing the size of an image artifact of the first imaging sensor by decreasing the duty cycle of the second light source.
  • 30. The method of claim 25, wherein the second light source is activated only when the first imaging sensor is in a vertical blanking interval.
Parent Case Info

This application claims the benefit of U.S. Provisional Patent Application No. 60/801,748, filed May 19, 2006, the entire disclosure of which is incorporated herein by reference.

US Referenced Citations (323)
Number Name Date Kind
3437747 Sheldon Apr 1969 A
3610231 Takahashi et al. Oct 1971 A
3643653 Takahashi et al. Feb 1972 A
3739770 Mori Jun 1973 A
3889662 Mitsui Jun 1975 A
3897775 Furihata Aug 1975 A
3918438 Hayamizu et al. Nov 1975 A
4261344 Moore et al. Apr 1981 A
4351587 Matsuo et al. Sep 1982 A
4398811 Nishioka et al. Aug 1983 A
4494549 Namba et al. Jan 1985 A
4573450 Arakawa Mar 1986 A
4586491 Carpenter May 1986 A
4625236 Fujimori et al. Nov 1986 A
4646722 Silverstein et al. Mar 1987 A
4699463 D'Amelio et al. Oct 1987 A
4721097 D'Amelio Jan 1988 A
4727859 Lia Mar 1988 A
4741326 Sidall et al. May 1988 A
4790295 Tashiro Dec 1988 A
4800870 Reid, Jr. Jan 1989 A
4825850 Opie et al. May 1989 A
4836211 Sekino et al. Jun 1989 A
4846154 MacAnally et al. Jul 1989 A
4852551 Opie et al. Aug 1989 A
4853773 Hibino et al. Aug 1989 A
4862873 Yajima et al. Sep 1989 A
4867138 Kubota et al. Sep 1989 A
4869238 Opie et al. Sep 1989 A
4870488 Ikuno et al. Sep 1989 A
4873572 Miyazaki et al. Oct 1989 A
4873965 Danieli Oct 1989 A
4884133 Kanno et al. Nov 1989 A
4899732 Cohen Feb 1990 A
4905667 Foerster et al. Mar 1990 A
4907395 Opie et al. Mar 1990 A
4911148 Sosnowski et al. Mar 1990 A
4911564 Baker Mar 1990 A
4926258 Sasaki May 1990 A
4947827 Opie et al. Aug 1990 A
4947828 Carpenter et al. Aug 1990 A
4979496 Komi Dec 1990 A
4991565 Takahashi et al. Feb 1991 A
5019040 Itaoka et al. May 1991 A
5025778 Silverstein et al. Jun 1991 A
5026377 Burton et al. Jun 1991 A
5050585 Takahashi Sep 1991 A
RE34100 Opie et al. Oct 1992 E
RE34110 Opie et al. Oct 1992 E
5159446 Hibino et al. Oct 1992 A
5166787 Irion Nov 1992 A
5178130 Kaiya et al. Jan 1993 A
5187572 Nakamura et al. Feb 1993 A
5193525 Silverstein et al. Mar 1993 A
5196928 Karasawa et al. Mar 1993 A
5253638 Tamburrino et al. Oct 1993 A
5260780 Staudt, III Nov 1993 A
5271381 Ailinger et al. Dec 1993 A
5305121 Moll Apr 1994 A
5318031 Mountford et al. Jun 1994 A
5329887 Ailinger et al. Jul 1994 A
5337734 Saab Aug 1994 A
5381784 Adair Jan 1995 A
5398685 Wilk et al. Mar 1995 A
5406938 Mersch et al. Apr 1995 A
5434669 Tabata et al. Jul 1995 A
5443781 Saab Aug 1995 A
5447148 Oneda et al. Sep 1995 A
5483951 Frassica et al. Jan 1996 A
5494483 Adair Feb 1996 A
5518501 Oneda et al. May 1996 A
5520607 Frassica et al. May 1996 A
5530238 Meulenbrugge et al. Jun 1996 A
5533496 De Faria-Correa et al. Jul 1996 A
5536236 Yabe et al. Jul 1996 A
5556367 Yabe et al. Sep 1996 A
5613936 Czarnek et al. Mar 1997 A
5614943 Nakamura et al. Mar 1997 A
5626553 Frassica et al. May 1997 A
5634466 Gruner Jun 1997 A
5653677 Okada et al. Aug 1997 A
5667476 Frassica et al. Sep 1997 A
5679216 Takayama et al. Oct 1997 A
5681260 Ueda et al. Oct 1997 A
5682199 Lankford Oct 1997 A
5685822 Harhen Nov 1997 A
5692729 Harhen Dec 1997 A
5696850 Parulski et al. Dec 1997 A
5702348 Harhen Dec 1997 A
5706128 Greenberg Jan 1998 A
5711299 Manwaring et al. Jan 1998 A
5722933 Yabe et al. Mar 1998 A
5752912 Takahashi et al. May 1998 A
5762603 Thompson Jun 1998 A
5817061 Goodwin et al. Oct 1998 A
5827177 Oneda et al. Oct 1998 A
5833603 Kovacs et al. Nov 1998 A
5843103 Wulfman Dec 1998 A
5843460 Labigne et al. Dec 1998 A
5860914 Chiba et al. Jan 1999 A
5876329 Harhen Mar 1999 A
5916147 Boury Jun 1999 A
5924977 Yabe et al. Jul 1999 A
5938587 Taylor et al. Aug 1999 A
5982932 Prokoski Nov 1999 A
5989182 Hori et al. Nov 1999 A
5989224 Exline et al. Nov 1999 A
6017358 Yoon Jan 2000 A
6026323 Skladnev et al. Feb 2000 A
6066090 Yoon May 2000 A
6099464 Shimizu et al. Aug 2000 A
6099466 Sano et al. Aug 2000 A
6099485 Patterson Aug 2000 A
6106463 Wilk Aug 2000 A
6174280 Oneda et al. Jan 2001 B1
6190330 Harhen Feb 2001 B1
6214028 Yoon et al. Apr 2001 B1
6261226 McKenna et al. Jul 2001 B1
6261307 Yoon et al. Jul 2001 B1
6277064 Yoon Aug 2001 B1
6296608 Daniels et al. Oct 2001 B1
6301047 Hoshino et al. Oct 2001 B1
6350231 Ailinger et al. Feb 2002 B1
6369855 Chauvel et al. Apr 2002 B1
6375653 Desai Apr 2002 B1
6387043 Yoon May 2002 B1
6433492 Buonavita Aug 2002 B1
6456684 Mun et al. Sep 2002 B1
6461294 Oneda et al. Oct 2002 B1
6482149 Torii Nov 2002 B1
6527704 Chang et al. Mar 2003 B1
6547724 Soble et al. Apr 2003 B1
6554767 Tanaka Apr 2003 B2
6564088 Soller et al. May 2003 B1
6640017 Tsai et al. Oct 2003 B1
6648816 Irion et al. Nov 2003 B2
6683716 Costales Jan 2004 B1
6687010 Horii et al. Feb 2004 B1
6697536 Yamada Feb 2004 B1
6699180 Kobayashi Mar 2004 B2
6736773 Wendlandt et al. May 2004 B2
6748975 Hartshorne et al. Jun 2004 B2
6796939 Hirata et al. Sep 2004 B1
6833871 Merrill et al. Dec 2004 B1
6845190 Smithwick et al. Jan 2005 B1
6891977 Gallagher May 2005 B2
6916286 Kazakevich Jul 2005 B2
6928314 Johnson et al. Aug 2005 B1
6929636 von Alten Aug 2005 B1
6947784 Zalis Sep 2005 B2
6951536 Yokoi et al. Oct 2005 B2
6965702 Gallagher Nov 2005 B2
6966906 Brown Nov 2005 B2
6974240 Takahashi Dec 2005 B2
6974411 Belson Dec 2005 B2
6997871 Sonnenschein et al. Feb 2006 B2
7004900 Wendlandt et al. Feb 2006 B2
7029435 Nakao Apr 2006 B2
7041050 Ronald May 2006 B1
7095548 Cho et al. Aug 2006 B1
7103228 Kraft et al. Sep 2006 B2
7116352 Yaron Oct 2006 B2
7173656 Dunton et al. Feb 2007 B1
7228004 Gallagher et al. Jun 2007 B2
7280141 Frank et al. Oct 2007 B1
7317458 Wada Jan 2008 B2
7322934 Miyake et al. Jan 2008 B2
7341555 Ootawara et al. Mar 2008 B2
7362911 Frank Apr 2008 B1
7389892 Park Jun 2008 B2
7405877 Schechterman Jul 2008 B1
7435218 Krattiger et al. Oct 2008 B2
7436562 Nagasawa et al. Oct 2008 B2
7507200 Okada Mar 2009 B2
7551196 Ono et al. Jun 2009 B2
7556599 Rovegno Jul 2009 B2
7561190 Deng et al. Jul 2009 B2
7621869 Ratnakar Nov 2009 B2
7646520 Funaki et al. Jan 2010 B2
7678043 Gilad Mar 2010 B2
7683926 Schechterman et al. Mar 2010 B2
7749156 Ouchi Jul 2010 B2
7825964 Hoshino et al. Nov 2010 B2
7864215 Carlsson et al. Jan 2011 B2
7910295 Hoon et al. Mar 2011 B2
7927272 Bayer et al. Apr 2011 B2
8009167 Dekel et al. Aug 2011 B2
8064666 Bayer Nov 2011 B2
8070743 Kagan et al. Dec 2011 B2
20010007468 Sugimoto et al. Jul 2001 A1
20010037052 Higuchi et al. Nov 2001 A1
20010051766 Gazdinski Dec 2001 A1
20010056238 Tsujita Dec 2001 A1
20020026188 Balbierz et al. Feb 2002 A1
20020039400 Kaufman et al. Apr 2002 A1
20020089584 Abe Jul 2002 A1
20020095168 Griego et al. Jul 2002 A1
20020099267 Wendlandt et al. Jul 2002 A1
20020101546 Sharp et al. Aug 2002 A1
20020110282 Kraft et al. Aug 2002 A1
20020115908 Farkas et al. Aug 2002 A1
20020156347 Kim et al. Oct 2002 A1
20020193662 Belson Dec 2002 A1
20030004399 Belson Jan 2003 A1
20030011768 Jung et al. Jan 2003 A1
20030032863 Kazakevich Feb 2003 A1
20030040668 Kaneko et al. Feb 2003 A1
20030045778 Ohline et al. Mar 2003 A1
20030065250 Chiel et al. Apr 2003 A1
20030088152 Takada May 2003 A1
20030093031 Long et al. May 2003 A1
20030093088 Long et al. May 2003 A1
20030103199 Jung et al. Jun 2003 A1
20030105386 Voloshin et al. Jun 2003 A1
20030120130 Glukhovsky Jun 2003 A1
20030125630 Furnish Jul 2003 A1
20030125788 Long Jul 2003 A1
20030130711 Pearson et al. Jul 2003 A1
20030153866 Long et al. Aug 2003 A1
20030161545 Gallagher Aug 2003 A1
20030167007 Belson Sep 2003 A1
20030171650 Tartaglia et al. Sep 2003 A1
20030176767 Long et al. Sep 2003 A1
20030179302 Harada et al. Sep 2003 A1
20030187326 Chang Oct 2003 A1
20030195545 Hermann et al. Oct 2003 A1
20030197781 Sugimoto et al. Oct 2003 A1
20030197793 Mitsunaga et al. Oct 2003 A1
20030216727 Long Nov 2003 A1
20030225433 Nakao Dec 2003 A1
20030233115 Eversull et al. Dec 2003 A1
20040023397 Vig et al. Feb 2004 A1
20040034278 Adams Feb 2004 A1
20040049096 Adams Mar 2004 A1
20040059191 Krupa et al. Mar 2004 A1
20040080613 Moriyama Apr 2004 A1
20040097790 Farkas et al. May 2004 A1
20040109164 Horii et al. Jun 2004 A1
20040109319 Takahashi Jun 2004 A1
20040111019 Long Jun 2004 A1
20040122291 Takahashi Jun 2004 A1
20040141054 Mochida et al. Jul 2004 A1
20040158124 Okada Aug 2004 A1
20040207618 Williams et al. Oct 2004 A1
20040242987 Liew et al. Dec 2004 A1
20050010084 Tsai Jan 2005 A1
20050014996 Konomura et al. Jan 2005 A1
20050020918 Wilk et al. Jan 2005 A1
20050020926 Wiklof et al. Jan 2005 A1
20050038317 Ratnakar Feb 2005 A1
20050038319 Goldwasser et al. Feb 2005 A1
20050068431 Mori Mar 2005 A1
20050085693 Belson et al. Apr 2005 A1
20050085790 Guest et al. Apr 2005 A1
20050096502 Khalili May 2005 A1
20050154278 Cabiri et al. Jul 2005 A1
20050165272 Okada et al. Jul 2005 A1
20050165279 Adler et al. Jul 2005 A1
20050177024 Mackin Aug 2005 A1
20050203420 Kleen et al. Sep 2005 A1
20050215911 Alfano et al. Sep 2005 A1
20050222500 Itoi Oct 2005 A1
20050228224 Okada et al. Oct 2005 A1
20050267361 Younker et al. Dec 2005 A1
20050272975 McWeeney et al. Dec 2005 A1
20050272977 Saadat et al. Dec 2005 A1
20060044267 Xie et al. Mar 2006 A1
20060052709 DeBaryshe et al. Mar 2006 A1
20060058584 Hirata Mar 2006 A1
20060106286 Wendlandt et al. May 2006 A1
20060149127 Seddiqui et al. Jul 2006 A1
20060149129 Watts et al. Jul 2006 A1
20060183975 Saadat et al. Aug 2006 A1
20060184037 Ince et al. Aug 2006 A1
20060217594 Ferguson Sep 2006 A1
20060279632 Anderson Dec 2006 A1
20060285766 Ali Dec 2006 A1
20060293562 Uchimura et al. Dec 2006 A1
20070015967 Boulais et al. Jan 2007 A1
20070015989 Desai et al. Jan 2007 A1
20070083081 Schlagenhauf et al. Apr 2007 A1
20070103460 Zhang et al. May 2007 A1
20070142711 Bayer et al. Jun 2007 A1
20070173686 Lin et al. Jul 2007 A1
20070177008 Bayer et al. Aug 2007 A1
20070177009 Bayer et al. Aug 2007 A1
20070183685 Wada et al. Aug 2007 A1
20070185384 Bayer et al. Aug 2007 A1
20070225552 Segawa et al. Sep 2007 A1
20070225734 Bell et al. Sep 2007 A1
20070238927 Ueno et al. Oct 2007 A1
20070244354 Bayer Oct 2007 A1
20070270642 Bayer et al. Nov 2007 A1
20070279486 Bayer et al. Dec 2007 A1
20070280669 Karim Dec 2007 A1
20070293720 Bayer Dec 2007 A1
20080021269 Tinkham et al. Jan 2008 A1
20080021274 Bayer et al. Jan 2008 A1
20080033450 Bayer et al. Feb 2008 A1
20080039693 Karasawa Feb 2008 A1
20080064931 Schena et al. Mar 2008 A1
20080065110 Duval et al. Mar 2008 A1
20080071291 Duval et al. Mar 2008 A1
20080079827 Hoshino et al. Apr 2008 A1
20080097292 Cabiri et al. Apr 2008 A1
20080114288 Whayne et al. May 2008 A1
20080130108 Bayer et al. Jun 2008 A1
20080154288 Belson Jun 2008 A1
20080199829 Paley et al. Aug 2008 A1
20080275298 Ratnakar Nov 2008 A1
20090015842 Leitgeb et al. Jan 2009 A1
20090023998 Ratnakar Jan 2009 A1
20090028407 Seibel et al. Jan 2009 A1
20090036739 Hadani Feb 2009 A1
20090049627 Kritzler Feb 2009 A1
20090082629 Dotan et al. Mar 2009 A1
20090105538 Van Dam et al. Apr 2009 A1
20090137867 Goto May 2009 A1
20090213211 Bayer et al. Aug 2009 A1
20090231419 Bayer Sep 2009 A1
20100217076 Ratnakar Aug 2010 A1
20110160535 Bayer et al. Jun 2011 A1
20110213206 Boutillette et al. Sep 2011 A1
Foreign Referenced Citations (70)
Number Date Country
1 628 603 Jun 2005 CN
1628603 Jun 2005 CN
196 26433 Jan 1998 DE
20 2006 017 173 Mar 2007 DE
0 586 162 Mar 1994 EP
1 570 778 Sep 2005 EP
1 769 720 Apr 2007 EP
711 949 Sep 1931 FR
49-130235 Dec 1974 JP
56-9712 Jan 1981 JP
56-56486 May 1981 JP
60-83636 May 1985 JP
60-111217 Jun 1985 JP
62-094312 Jun 1987 JP
63-309912 Dec 1988 JP
1-267514 Oct 1989 JP
1-172847 Dec 1989 JP
2-295530 Dec 1990 JP
3-159629 Jul 1991 JP
4-500768 Feb 1992 JP
5-285091 Nov 1993 JP
5-307144 Nov 1993 JP
5-341210 Dec 1993 JP
60-76714 Mar 1994 JP
6-130308 May 1994 JP
6-169880 Jun 1994 JP
7-352 Jan 1995 JP
7-354 Jan 1995 JP
7-021001 Apr 1995 JP
8-206061 Aug 1996 JP
7-136108 May 1998 JP
11-76150 Mar 1999 JP
2003-220023 Aug 2003 JP
2004-202252 Jul 2004 JP
2004-525717 Aug 2004 JP
2004-537362 Dec 2004 JP
2007-143580 Jun 2007 JP
WO 9315648 Aug 1993 WO
WO-9917542 Apr 1999 WO
WO-9930506 Jun 1999 WO
WO 02085194 Oct 2002 WO
WO-02094105 Nov 2002 WO
WO-02094105 Nov 2002 WO
WO-03013349 Feb 2003 WO
WO-03013349 Feb 2003 WO
WO-2006073676 Jul 2006 WO
WO-2006073725 Jul 2006 WO
WO-2006110275 Oct 2006 WO
WO-2006110275 Oct 2006 WO
WO-2007015241 Feb 2007 WO
WO-2007015241 Feb 2007 WO
WO-2007070644 Jun 2007 WO
WO-2007070644 Jun 2007 WO
WO-2007087421 Aug 2007 WO
WO-2007087421 Aug 2007 WO
WO-2007092533 Aug 2007 WO
WO-2007092533 Aug 2007 WO
WO-2007092636 Aug 2007 WO
WO-2007092636 Aug 2007 WO
WO-2007136859 Nov 2007 WO
WO-2007136859 Nov 2007 WO
WO-2007136879 Nov 2007 WO
WO-2007136879 Nov 2007 WO
WO-2007136879 Nov 2007 WO
WO-2009014895 Jan 2009 WO
WO-2009015396 Jan 2009 WO
WO-2009015396 Jan 2009 WO
WO-2009049322 Apr 2009 WO
WO-2009049322 Apr 2009 WO
WO-2009062179 May 2009 WO
Related Publications (1)
Number Date Country
20070279486 A1 Dec 2007 US
Provisional Applications (1)
Number Date Country
60801748 May 2006 US