The present invention generally relates to remote-head imaging, and more particularly, to an endoscopy device with a multi-purpose camera control unit that supports multiple input devices.
Remote-head imaging devices, and more particularly, endoscopes and video-endoscopes, are used in medical and industrial applications to view the inside of cavities, bodily canals, hollow organs, and other remote locations. Typically, video-endoscopes consist of an input device, such as a distal-end (the end closest to the patient) camera on a rigid or flexible scope, that is attached to a camera control unit. The camera control unit typically supplies power to the camera, controls operation of the camera, receives raw video and non-video data from the camera, and outputs processed video data to a video display.
Conventional camera control units for video-endoscopes and remote-head imaging systems are limited in use, however, because they support only one type of input device. For example, a conventional camera control unit for a flexible scope with a distal-end camera would be unable to control a stereoscopic imaging head. In conventional systems, if a different input device is needed for a particular application, a different camera control unit adapted to that specific input device must be used as well.
The present invention addresses these shortcomings by providing a remote-head imaging system with a camera control unit that reconfigures itself and/or its internal functionality so as to support multiple different input devices.
In accordance with one aspect of the present invention, a camera control unit is provided for controlling operation of a sensor head and for processing camera data received from the sensor head. The camera control unit includes an electrical interface detachably connectable to sensor heads of multiple different sensor types, and a reconfigurable controller for timing and control of the electrical interface. The camera control unit also includes a system controller which obtains identification information from the sensor head and reconfigures the reconfigurable controller for timing and control of the electrical interface based on the identification information obtained from the sensor head.
In preferred aspects of the invention, the reconfigurable controller is a field-programmable gate array (“FPGA”).
In accordance with another aspect of the present invention, the camera control unit includes a reconfigurable controller for timing and control of the sensor head, for receiving camera data from the sensor head, and for directing the camera data along a data path. A digital signal processor performs an image processing operation on camera data on the data path. The reconfigurable controller may be constructed from a hardware device with programmable functionality, such as an FPGA. The camera control unit also includes persistent re-writeable memory for storing program instructions or code executable by the digital signal processor to perform the image processing operation, and configuration information for configuring the reconfigurable controller to perform the timing and control, and otherwise to change the functionality of the camera control unit. The camera control unit further includes a system controller for loading the program instructions into the digital signal processor and for configuring the reconfigurable controller in accordance with the configuration information. By utilizing a persistent re-writeable memory for storing program instructions for the digital signal processor and configuration information for the reconfigurable controller, the present invention allows for updating functionality of the camera control unit in the field.
According to another aspect of the present invention, the camera control unit includes a reconfigurable controller for timing and control of the sensor head, for receiving camera data from the sensor head, and for performing a pixel preprocessing operation on the received camera data. The camera control unit further includes a digital signal processor for performing an image processing operation on the received camera data. In addition, the camera control unit includes a persistent re-writeable memory for storing multiple sets of program instructions executable by the digital signal processor to perform an image processing operation, and for storing multiple sets of configuration information for configuring the reconfigurable controller to perform the timing and control and pixel preprocessing operation. An input device allows for the selection of a set of program instructions to be used by the digital signal processor and selection of the configuration information for the reconfigurable controller. The camera control unit also includes a system controller for loading the selected set of program instructions into the digital signal processor and for reconfiguring the reconfigurable controller in accordance with the selected configuration information. In this way, a user can select specific processing operations and hardware configurations to be performed by the camera control unit.
According to yet another aspect of the present invention, the camera control unit includes a reconfigurable display formatter for formatting a display of the processed camera data and for generating an output timing signal, a video format selection unit for selecting one of a plurality of video formats, and a system controller for reconfiguring the reconfigurable display formatter for formatting and generating the output timing signal in response to the selection of a video format within the video format selection unit.
According to another aspect of the present invention, the camera control unit includes a hardware clock for generating a clock signal, a digital signal processor (DSP) for receiving the hardware clock signal and for generating a DSP clock signal from the hardware clock signal, and a reconfigurable controller that receives sensor data based on a first clock signal and that outputs processed video data based on a second clock signal. The reconfigurable controller includes a reconfigurable logic array that receives the DSP clock signal and generates the first and second clock signals. In addition, a system controller is provided which reconfigures the logic array to generate selectably different first and second clock signals. In this way, input and output clock signals can be independently generated and adjusted from the same hardware clock signal.
According to still another aspect of the present invention, the camera control unit includes a reconfigurable controller, a digital signal processor, including an input memory and an output memory, for image-processing of camera data in the input memory and outputting the processed camera data to the output memory, and a video encoder. The reconfigurable controller receives camera data from the sensor head and routes the camera data to the input memory of the digital signal processor. In addition, the reconfigurable controller accesses the processed camera data from the output memory of the digital signal processor and routes the processed camera data to the video encoder. In this way, the digital signal processor can be devoted to image-processing of the camera data, since it is the reconfigurable controller that deposits data for processing in the input memory and retrieves processed data from the output memory.
In another aspect of the invention, a camera control unit is provided for controlling the operation of a remote-head input device, for receiving and processing digital camera data from the remote-head input device, and for outputting the processed data to a monitor. Typically, the remote-head input device used with the camera control unit of the present invention is an electronic video-endoscope, or a snap-on type camera head configured to be detachably mounted to the eyepiece of a conventional endoscope.
One feature of the camera control unit is that the camera control unit adapts to multiple different types of camera heads. For example, the camera heads may contain sensors that vary in size, speed, or resolution. To adapt to these different camera heads, the camera control unit reconfigures its internal functionality by loading specific sets of software (program instructions) and firmware (as defined by configuration information) in response to the detection and recognition of an attached camera head.
Another feature of the invention is that the camera control unit reconfigures control circuitry using specific configuration information so that video can be output in a plurality of different formats. In this way, for example, both NTSC and PAL television standards can be supported.
Other features of the present invention include hardware acceleration, clock adjustability, user selectable configuration, and field programmable software and firmware.
This summary has been provided so that the nature of the invention may be understood quickly. A more complete understanding of the invention can be obtained by reference to the following detailed description, appended claims, and accompanying drawings.
Referring now to the drawings, flexible video endoscope 8 is connected to camera control unit 6 by attaching electrical connector 7A of umbilicus 4 to complementary connector 7B of camera control unit 6, which might be a card edge receptacle. Flexible video endoscope 8 includes distal tip 1, endoscope shaft assembly 2, endoscope body assembly 3, and sealed endoscope switches 5.
Distal tip 1 includes a camera head and a mechanical objective head which encloses and seals the camera head. Located in the camera head is an optical system, which includes an image sensor, sensor support electronics, an illumination end point, and forceps tubing. Preferably, the image sensor is a CMOS sensor; however, other sensors that capture image data and generate a digital output may also be utilized. The illumination end point may be implemented as an LED or the end of a fiber optic bundle.
Endoscope shaft assembly 2 houses an electrical wiring portion of the camera head, forceps tubing, and deflection pull wires. In addition, the endoscope shaft assembly may also contain a fiber optic illumination bundle, if used.
Endoscope body assembly 3 contains mechanical mechanisms used to activate the deflection system as well as support electronics for the camera head. Typically, the support electronics in endoscope body assembly 3 will include a permanent storage device, such as an EPROM, that stores camera parameters that define the type of camera head.
Sealed endoscope switches 5, keyboard 10, and user interface 11 allow the user to control certain electrical and/or software features of the camera head. In addition, the switches on the endoscope may be used to select processing to be performed by the camera control unit. PC 12 and quick-swap memory device 13 allow the user to update and maintain the software and firmware within camera control unit 6.
The camera data path begins at sensor 301 located in camera head 300. In a preferred embodiment, sensor 301 is a CMOS sensor located at the distal tip of a video-endoscope. Also included in camera head 300 is illumination system 340. Power, illumination, and camera control, such as automatic gain control and exposure time, are provided by system controller 304 to sensor 301 over inter-integrated circuit (“I2C”) bus 330. The clock at which the sensor operates, sen_clk 391, is supplied by reconfigurable controller 302.
Each pixel of raw camera data generated by sensor 301 is transmitted to reconfigurable controller 302 at a gray-scale or color resolution of 8 or 10 bits, corresponding to 256 or 1024 gray-scale or color levels, respectively. Depending on the type of camera head, this raw camera data can be transmitted in parallel, in serial, or in a parallel/serial combination. In addition to raw camera data, and based on sen_clk signal 391 from reconfigurable controller 302, sensor 301 supplies a pixel clock (sen_pix_clk 392) as well as vertical and horizontal synchronization signals (sen_v_sync 394, sen_h_sync 395) to reconfigurable controller 302. The pixel clock determines the speed at which camera data pixels are clocked into reconfigurable controller 302. The vertical and horizontal synchronization signals indicate the start of each frame and line, respectively.
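By way of illustration, an 8 bit pixel can represent 2^8 = 256 levels and a 10 bit pixel 2^10 = 1024 levels. The following C sketch shows, under the assumption of a simple most-significant-bit-first serial stream, how serially transmitted pixel bits might be assembled into 10 bit parallel words; the function name and bit ordering are assumptions made for this example rather than details taken from the description above.

    #include <stdint.h>
    #include <stddef.h>

    /* Illustrative only: assemble a most-significant-bit-first serial
     * stream (one bit per array element) into 10 bit parallel pixel
     * words, each holding one of 1024 gray-scale or color levels. */
    size_t pack_10bit_pixels(const uint8_t *bits, size_t nbits,
                             uint16_t *pixels, size_t max_pixels)
    {
        uint16_t shift_reg = 0;
        int bit_count = 0;
        size_t npix = 0;

        for (size_t i = 0; i < nbits && npix < max_pixels; ++i) {
            shift_reg = (uint16_t)((shift_reg << 1) | (bits[i] & 1u));
            if (++bit_count == 10) {                  /* one complete pixel */
                pixels[npix++] = shift_reg & 0x03FFu; /* values 0..1023     */
                shift_reg = 0;
                bit_count = 0;
            }
        }
        return npix;    /* number of complete 10 bit pixels assembled */
    }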
The raw camera data generated by sensor 301 is received by data formatter 306 in reconfigurable controller 302. Sensor clock generator 305 supplies sen_clk signal 391 to sensor 301, while sensor timing block 312 receives the pixel clock and synchronization signals from sensor 301. In addition, sensor timing block 312 generates image size control signals, such as start/end line and start/end pixel signals, based on the signals from sensor 301. These signals may be used to specify a subset of the sensor area for which data is desired. For example, because of the configuration of the specific camera head, the sensor data might be windowed such that not all of the camera sensor data is used.
Together, the signals generated and received by sensor clock generator 305, data formatter 306, and sensor timing block 312 provide an electrical interface between an attached camera head sensor and camera control unit 6.
Data formatter 306 receives the raw camera data from sensor 301 at the rate of the pixel clock and formats the received pixels into a 10 bit parallel format. The data is then passed to pixel preprocessing block 307. At this point, pixel preprocessing may be performed so as to compensate for sensor calibration and localized anomalies. Representative preprocessing operations include bad pixel correction, white balance, color correction, and intensity correction.
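The following C sketch illustrates one possible bad pixel correction of the kind performed during pixel preprocessing, using a list of known bad pixel locations for the attached sensor (such a list is among the camera parameters described below). The replacement-by-neighbor-average strategy and the function name are assumptions made only for this example.

    #include <stdint.h>
    #include <stddef.h>

    /* Illustrative bad pixel correction: each listed bad pixel in a
     * row-major frame of 10 bit samples is replaced by the average of
     * its horizontal neighbors. */
    void correct_bad_pixels(uint16_t *frame, int width, int height,
                            const int *bad_index, size_t n_bad)
    {
        if (width < 2)
            return;                          /* nothing sensible to average */
        for (size_t i = 0; i < n_bad; ++i) {
            int idx = bad_index[i];
            if (idx < 0 || idx >= width * height)
                continue;                    /* ignore out-of-range entries */
            int x = idx % width;
            uint32_t left  = (x > 0)         ? frame[idx - 1] : frame[idx + 1];
            uint32_t right = (x < width - 1) ? frame[idx + 1] : frame[idx - 1];
            frame[idx] = (uint16_t)((left + right) / 2);
        }
    }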
The preprocessed camera data is then stored in memory 308. From memory 308, HPI unit 310 uses a direct memory access ("DMA") protocol to clock the camera data into the internal input memory of DSP 303.
DSP 303 then performs an image processing operation on the received camera data as defined by program instructions stored in program memory 385. The program instructions are loaded into program memory 385 over serial port interface (“SPI”) 380 by system controller 304. The processed camera data is stored in internal output memory 384.
After DSP 303 has processed the camera data, HPI unit 310 again uses DMA protocol to access the processed data from output memory 384. This 32 bit data is decoded to 16 bits and stored in memory 309. Like memory 308, memory 309 is typically implemented as an asynchronous 16 bit FIFO register. The processed camera data is then passed to display formatter 311 which adjusts the output timing and control signals for a selected output format. The output format can be selected with a switch on user interface 11 or by a selection on keyboard 10. In addition, display formatter 311 adds overlay text to the processed camera data.
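As a minimal illustration of the width conversion described above, the following C sketch packs two 16 bit pixel words into each 32 bit DSP word and decodes them back to 16 bits; the function names and word ordering are hypothetical.

    #include <stdint.h>
    #include <stddef.h>

    /* Illustrative 16/32 bit width conversion: two 16 bit words per
     * 32 bit DSP word.  Encoding on the way into the DSP is the mirror
     * image of the decoding shown here. */
    uint32_t pack16x2(uint16_t hi, uint16_t lo)
    {
        return ((uint32_t)hi << 16) | lo;
    }

    void decode_32_to_16(const uint32_t *in, size_t n_words, uint16_t *out)
    {
        for (size_t i = 0; i < n_words; ++i) {
            out[2 * i]     = (uint16_t)(in[i] >> 16);      /* upper half */
            out[2 * i + 1] = (uint16_t)(in[i] & 0xFFFFu);  /* lower half */
        }
    }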
The processed camera data is then passed to video encoder 313 along with an encoder clock signal (enc_clk 396) generated by display formatter 311 in accordance with the selected output format. After encoding, the video data is sent to monitor 9.
System controller 304 monitors and controls the flow of data along the above-described data path. System controller 304 controls the operation of sensor 301 and is responsible for loading software run by DSP 303 and for reconfiguring reconfigurable controller 302. In a preferred embodiment, system controller 304 is implemented as a Motorola 9HCS12 controller, manufactured by Motorola of Schaumburg, Ill.
The following sections discuss how system controller 304 interacts with the other components of camera control unit 6 to achieve the features of multi-head adaptability, multiple output capability, hardware acceleration, clock adjustability, user selectable configuration, and field programmable software and firmware.
As mentioned above, one feature of the present invention lies in the ability of the camera control unit to reconfigure its functionality to adapt to different types of sensors and camera heads. In this way, one camera control unit may be used for multiple different heads.
When a remote-head input device, such as flexible video endoscope 8, is attached to camera control unit 6, system controller 304 receives a head detect signal over I2C bus 330 (step S402). Processing then proceeds to step S403, where camera parameters stored in a readable camera parameter storage device, such as EPROM 350 located in camera head 300, are read by system controller 304 over I2C bus 330. The camera parameters may include such data fields as camera type, version, serial number, image size type, image format type, white balance reference matrix, color correct matrix, intensity correct lookup tables, and bad pixels index list.
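A minimal C sketch of such a parameter read is given below; the structure layout, field sizes, I2C address, and the i2c_read() helper are assumptions for illustration only, not the actual EPROM format.

    #include <stdint.h>

    /* Illustrative layout only; structure packing is ignored for clarity. */
    struct camera_params {
        uint16_t camera_type;
        uint16_t version;
        uint32_t serial_number;
        uint8_t  image_size_type;
        uint8_t  image_format_type;
        int16_t  white_balance_ref[3][3];
        int16_t  color_correct[3][3];
        uint16_t bad_pixel_count;
        uint16_t bad_pixel_index[64];
        /* intensity correct lookup tables omitted for brevity */
    };

    /* Stand-in for an I2C block read performed by the system controller
     * over I2C bus 330; the address and signature are assumed. */
    extern int i2c_read(uint8_t dev_addr, uint16_t offset,
                        void *buf, uint16_t len);

    int read_camera_params(struct camera_params *p)
    {
        /* 0x50 is a common serial-EPROM address, used only as an example */
        return i2c_read(0x50, 0x0000, p, (uint16_t)sizeof *p);
    }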
In step S404, based on the camera parameters obtained in step S403, system controller 304 consults internal lookup table 371 to determine what functionalities are available for the type of camera head attached. A camera parameter, such as camera type or serial number, is associated in lookup table 371 with one or more sets of configuration information and program instructions stored in a persistent re-writeable memory, such as a flash memory device or EEPROM. In the embodiment described here, the persistent re-writeable memory is flash memory 317, which stores configuration information 360 and program instructions 370.
If only one set of configuration information and program instructions is available for use with the attached camera head, that set is selected and the head initialization process proceeds to step S405. If multiple sets of configuration information and program instructions are available for the attached camera head, step S404 prompts the user to input a selection of the desired functionality. The user prompt may appear on monitor 9 or an LCD on user interface 11. The user selection may be accomplished through the use of tactile buttons on user interface 11, keyboard 10, or endoscope switches 5.
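The following C sketch illustrates the kind of association lookup table 371 provides, using hypothetical structure and function names: if only one set exists it is used directly, otherwise the user is prompted to choose.

    #include <stddef.h>
    #include <stdint.h>

    /* Illustrative association of a camera type with one or more
     * firmware/software image pairs held in persistent re-writeable
     * memory; all names, fields, and offsets are assumptions. */
    struct config_set {
        const char *label;           /* e.g. "standard", "low light"        */
        uint32_t    fpga_image_off;  /* offset of configuration information */
        uint32_t    dsp_image_off;   /* offset of program instructions      */
    };

    struct head_entry {
        uint16_t                 camera_type;
        const struct config_set *sets;
        size_t                   n_sets;
    };

    /* Stand-in for the user prompt on monitor 9 or user interface 11. */
    extern size_t prompt_user(const struct head_entry *entry);

    const struct config_set *select_config(const struct head_entry *entry)
    {
        if (entry->n_sets == 1)
            return &entry->sets[0];          /* only one choice: use it */
        size_t choice = prompt_user(entry);  /* otherwise ask the user  */
        return &entry->sets[choice < entry->n_sets ? choice : 0];
    }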
In step S405, based on the selection of functionality in step S404, system controller 304 obtains the selected set of configuration information 360 from flash memory 317 and configures reconfigurable controller 302 accordingly. In a preferred embodiment, configuration information 360 is a compiled VHDL program (where "VHDL" refers to VHSIC (very high speed integrated circuit) hardware description language). VHDL is a hardware description language used to configure programmable logic devices, such as FPGAs and CPLDs. Other hardware description languages, such as Verilog, could also be used as configuration information 360, although it is preferred that the configuration information is stored in flash memory 317 in compiled form. In a preferred embodiment, the selected configuration information is used to configure reconfigurable controller 302 by utilizing the FPGA programming pins. For example, the FPGA programming pins for the XILINX Spartan-IIE XC2S300E are configuration data input pins D0-D7.
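A simplified C sketch of a parallel configuration load is shown below. It assumes hypothetical GPIO helper functions, and the chip-select, write-enable, and busy handshaking of an actual SelectMAP-style interface are omitted for brevity; it is a sketch of the technique, not the actual loading routine.

    #include <stdint.h>
    #include <stddef.h>

    /* Hypothetical GPIO helpers wired to the FPGA configuration pins. */
    extern void gpio_write(int pin, int level);
    extern int  gpio_read(int pin);
    extern void gpio_write_data_bus(uint8_t value);   /* drives D0-D7 */

    enum { PIN_PROG_B, PIN_CCLK, PIN_INIT_B, PIN_DONE };

    /* Simplified parallel load of a compiled bitstream: pulse PROG_B to
     * clear the device, wait for INIT_B, then present one byte on D0-D7
     * per CCLK rising edge until the whole bitstream has been clocked in. */
    int fpga_configure(const uint8_t *bitstream, size_t len)
    {
        gpio_write(PIN_PROG_B, 0);            /* start clearing configuration */
        gpio_write(PIN_PROG_B, 1);
        while (!gpio_read(PIN_INIT_B))
            ;                                 /* wait until ready for data    */
        for (size_t i = 0; i < len; ++i) {
            gpio_write_data_bus(bitstream[i]);
            gpio_write(PIN_CCLK, 1);          /* data captured on rising edge */
            gpio_write(PIN_CCLK, 0);
        }
        return gpio_read(PIN_DONE) ? 0 : -1;  /* DONE high means configured   */
    }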
In step S406, based on the selection of functionality in step S404, system controller 304 obtains the selected program instructions 370 from flash memory 317 and loads the program instructions into program memory 385 of DSP 303. Program instructions are program code executable by DSP 303 and are loaded from flash memory 317 to DSP 303 by system controller 304 via serial port interface (SPI) bus 380.
Finally, in step S407, camera control unit 6 proceeds with normal operation by running the operative program instructions (i.e. software) in DSP 303 and the configuration information (i.e. firmware) loaded into reconfigurable controller 302, as described above.
Referring back to step S405 and the configuration of reconfigurable controller 302, one portion of reconfigurable controller 302 that may become reconfigured in response to a new camera head is the control circuitry for the electrical interface. As discussed above, the signals generated and received by sensor clock generator 305, data formatter 306, and sensor timing block 312 provide the electrical interface between an attached camera head and camera control unit 6. Altering how the signals of the electrical interface are generated and manipulated is one aspect of reconfiguring reconfigurable controller 302 to adapt to a different head.
Referring again to the electrical interface, sensor clock generator 305 supplies sen_clk signal 391 to the attached sensor. Sensor clock generator 305 is typically implemented as a divider or a controllable phase-locked-loop circuit, and can be reconfigured by system controller 304 so as to supply a clock frequency appropriate to the type and speed of the attached sensor.
Sensor timing block 312 is reconfigured based on the type of camera head so as to generate signals tailored to the connected camera head from the clock and synchronization signals received from the camera head. Generally, sensor timing block 312 passes the pixel clock and synchronization signals generated by the sensor to data formatter 306, pixel preprocessing block 307, and memory 308. In addition, sensor timing block 312 generates line counters and pixel counters for bad pixel correct 501 in pixel preprocessing block 307.
Additionally, sensor timing block 312 can be reconfigured to generate image size control signals, such as start/end line and start/end pixel control signals. These signals mark the start and end of a valid data area or window of the image sensor. In effect, these signals allow the camera control unit to discard camera data from defined areas of the image sensor and only process camera data from a desired area. For example, an image sensor for a snap-on camera may have a rectangular shape. However, since the sensor of a snap-on camera “looks through” the circular eyepiece of an optical endoscope, data around the edges may be unneeded if the sensor viewing area is larger than the eyepiece viewing area. Through the use of the start/end line and start/end pixel control signals, camera data from the edges can be discarded, thus improving processing speed. Similarly, the start/end line and start/end pixel control signals can be used to define an area for digital zoom.
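The following C sketch shows how a valid data window defined by start/end line and start/end pixel values might be tested; the structure and function names are illustrative assumptions rather than elements of the system described above.

    #include <stdbool.h>
    #include <stdint.h>

    /* Illustrative valid-data window defined by start/end line and
     * start/end pixel values (inclusive). */
    struct window {
        uint16_t start_line, end_line;
        uint16_t start_pixel, end_pixel;
    };

    /* Data outside the window (for example, sensor area outside a
     * circular eyepiece, or outside a digital zoom region) would be
     * discarded rather than processed. */
    bool pixel_in_window(const struct window *w,
                         uint16_t line, uint16_t pixel)
    {
        return line  >= w->start_line  && line  <= w->end_line &&
               pixel >= w->start_pixel && pixel <= w->end_pixel;
    }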
A further aspect of the electrical interface that may become reconfigured is that of data formatter 306. As mentioned above, data formatter 306 typically receives 8 or 10 bit raw camera data from sensor 301 and arranges it into a 10 bit parallel format. However, since the logic circuitry of data formatter 306 can be reconfigured by system controller 304 using control bus 335, other data formats may be created. For instance, larger bit formats may be beneficial for higher resolution sensors. In addition, it may be necessary to reconfigure data formatter 306 to accept data streams of varying formats, such as parallel or serial data.
Reconfiguration is not limited to the electrical interface. Other aspects of reconfigurable controller 302 may also be reconfigured. For example, in pixel preprocessing block 307, different processing functions may be beneficial for different camera heads. As with the electrical interface control circuitry, system controller 304 may reconfigure pixel preprocessing block 307 in response to a new camera head attachment. As before, reconfiguration is achieved through the control bus 335. System controller 304 associates the received camera parameters with configuration information stored in flash memory 317 and reconfigures pixel preprocessing block 307 with the preprocessing functions used for the attached camera head. In addition, for certain preprocessing functions, rather than storing the necessary matrices and lookup tables in the flash memory, specific fields of the camera parameter data can be loaded into pixel preprocessing block 307, such as white balance reference matrix, color correct matrix, intensity correct lookup tables, and bad pixels index list.
Another block in reconfigurable controller 302 that may be reconfigured for a new camera head is memory 308. While for most sensor sizes memory 308 is implemented as an asynchronous 16 bit FIFO, larger amounts of memory may be needed for larger sensors. In addition, for some programs stored in program memory 385 of DSP 303, the 16 bit FIFO registers may not have enough capacity. In this case, system controller 304 can reconfigure memory 308 to act as a controller for writing to a memory external to the reconfigurable controller, such as SDRAM 615. For writing, system controller 304 configures a write address generator to supply write addresses to SDRAM controller 612, which stores the incoming camera data in SDRAM 615.
For retrieving the data from SDRAM 615, system controller 304 configures read address generator 613 to generate read addresses used by SDRAM controller 612. Using these addresses, SDRAM controller 612 retrieves the addressed data from SDRAM 615 and passes the camera data to output FIFO 614. The data is reformatted to 16 bits by padding before being passed to HPI unit 310.
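As a simple model of such address generation, the following C sketch advances a write or read address by one word per transfer and wraps over an assumed buffer region; the base address, buffer size, and names are illustrative only.

    #include <stdint.h>

    /* Illustrative address generator for the external SDRAM buffer. */
    #define BUF_BASE_ADDR 0x00000000u
    #define BUF_WORDS     (1024u * 1024u)     /* assumed buffer size */

    struct addr_gen {
        uint32_t offset;                      /* next word offset    */
    };

    uint32_t addr_gen_next(struct addr_gen *g)
    {
        uint32_t addr = BUF_BASE_ADDR + g->offset;
        g->offset = (g->offset + 1u) % BUF_WORDS;
        return addr;
    }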
While all the reconfigurations mentioned above are performed in response to detecting a new camera and reading its camera parameters, camera identification can be obtained without necessarily reading it from a readable camera parameter storage device. For example, a new head can be identified through a calibration process. In addition, camera features such as sensor size may be inferred from synchronization signals.
The multiple output capability of the present invention allows the user to select from a plurality of video formats for the display of processed camera data. For example, the user could choose among NTSC, PAL, and RGB formats depending on the type of display being used. As discussed above, the output format is selected with a switch on user interface 11 or by a selection on keyboard 10, and system controller 304 responds to that selection by reconfiguring the relevant output circuitry.

With respect to configuration of reconfigurable controller 302, system controller 304 sends a signal to display formatter 311 via control bus 335.
Display formatter 311 receives the processed sensor data signal from memory 309 and converts the data format from 16 bits to 8 bits using decoder 802. The converted data is then sent to display multiplexer 803. Overlay unit 801 receives a signal containing overlay text data from system controller 304 via control bus 335 and receives a signal containing timing data from display timing block 804. Overlay unit 801 generates a contour for the overlay text data and cursor and sends the properly timed text data signal to display multiplexer 803. Display multiplexer 803 combines the processed sensor data signal and the text data signal, sending the combined signal to video encoder 313.
In an alternate arrangement, text data need not be combined with sensor data. For example, text may be displayed on its own screen, separate from the sensor data. This arrangement obviates the use of the overlay unit and the display multiplexer.
Display timing block 804 generates the output timing and encoder clock signals in accordance with the selected video format, so that the processed camera data can be encoded by video encoder 313 and displayed in the selected format, for example NTSC or PAL.
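For illustration, the following C table lists nominal, well-known timing values for NTSC and PAL; these numbers are standard values rather than parameters recited in this description, and the structure and function are hypothetical.

    #include <string.h>

    /* Nominal timing values for two analog output formats, shown only
     * to indicate the kind of parameters the display formatter must
     * change when a different format is selected. */
    struct video_format {
        const char *name;
        unsigned    total_lines;       /* lines per frame           */
        unsigned    active_lines;      /* nominal visible lines     */
        double      frame_rate_hz;     /* frames per second         */
        double      pixel_clock_mhz;   /* ITU-R BT.601 sample rate  */
    };

    static const struct video_format formats[] = {
        { "NTSC", 525, 480, 29.97, 13.5 },
        { "PAL",  625, 576, 25.00, 13.5 },
    };

    const struct video_format *find_format(const char *name)
    {
        for (size_t i = 0; i < sizeof formats / sizeof formats[0]; ++i)
            if (strcmp(formats[i].name, name) == 0)
                return &formats[i];
        return 0;
    }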
[Hardware Acceleration]
Another feature of the present invention is hardware acceleration, by which reconfigurable controller 302 performs operations in hardware so as to increase the throughput of the camera data path and to allow DSP 303 to be devoted to image processing of the camera data.
More specifically, one way in which reconfigurable controller 302 increases throughput is by acting as the interface between sensor 301 and DSP 303. As described above, reconfigurable controller 302 supplies timing to the sensor, formats the incoming data, performs any necessary pixel preprocessing on the data, and stores the incoming data in memory for use by the DSP. Conversely, reconfigurable controller 302 accesses data already processed by the DSP from a memory, formats the data for display, and delivers it to a video encoder.
For example, HPI controller 712 within reconfigurable controller 302 uses DMA to clock camera data from memory 308 into the input memory of DSP 303 without involving the DSP's processing core. Likewise, when DSP 303 has finished processing a certain number of camera data lines (preferably two lines of camera data), HPI controller 712 utilizes DMA to clock the processed camera data back into reconfigurable controller 302 from output memory 384. Decoder 713 then converts the 32 bit data to a 16 bit format and sends the processed data to memory 309 and then on to display formatter 311.
Another feature of the present invention lies in the use of a hardware clock signal from which other clock signals can be generated at clock frequencies adjustable by software. A hardware clock supplies a hardware clock signal to DSP 303, which in turn generates one or more DSP clock signals from the hardware clock signal and supplies them to reconfigurable controller 302.
Reconfigurable controller 302 utilizes the DSP clock signals to generate clock signals for the sensor, display formatter, video encoder, and HPI unit. As discussed above with reference to sensor clock generator 305, the clock generators in reconfigurable controller 302 are typically implemented as dividers or controllable phase-locked-loop circuits. Sensor clock generator 305, HPI clock generator 315, TV clock generator 316, and the encoder clock are each configurable by system controller 304. By providing separate clock generators for each phase of the data flow, the camera control unit is able to alter input and output timing independently.
Altering input and output timing independently is beneficial since it is typically necessary to conform the output timing of a video system to the timing requirements of the output device (e.g. NTSC or PAL). However, the requirements for input timing are generally much more variable. Factors such as amount of motion in the image, intensity of light available, and type and size of sensor used can affect what input timing is optimal. By providing a software controllable clock for input timing, fine adjustments can be made. For example, fine adjustments could be made to the sensor exposure timing instead of using gain amplifiers to improve the signal. This is desirable, as gain amplifiers generally introduce noise. Furthermore, since the output timing signal is independent of the input timing signal, fine adjustments to input timing will not produce timing anomalies in the output video signal.
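The following short, self-contained C example illustrates the kind of arithmetic involved: a sensor clock derived from a DSP clock by an integer divider, and an exposure time expressed in line periods. All numeric values are assumptions chosen only for the example.

    #include <stdio.h>

    int main(void)
    {
        const double dsp_clk_hz = 100e6;     /* assumed DSP clock          */
        const int    divider    = 4;         /* programmable divider value */
        const double sen_clk_hz = dsp_clk_hz / divider;

        const int    pixels_per_line = 800;  /* assumed, incl. blanking    */
        const double line_period_s   = pixels_per_line / sen_clk_hz;

        const int exposure_lines = 500;      /* adjustable in software     */
        printf("sen_clk = %.1f MHz, exposure = %.3f ms\n",
               sen_clk_hz / 1e6, exposure_lines * line_period_s * 1e3);
        return 0;
    }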
According to another feature of the present invention, a user is able to select, such as by selection from a menu of plural options, the program instructions (i.e. software) and configuration information (i.e. firmware) used by the camera control unit. Using an input device, such as keyboard 10 or user interface 11, the user selects the desired set of program instructions and configuration information from among the sets stored in flash memory 317.
According to one preferred embodiment of this aspect of the invention, after system controller 304 has detected a new camera head, rather than reconfiguring the reconfigurable controller and loading program instructions into the DSP with predetermined firmware and software, the user may be prompted to select from among two or more sets of configuration information and program instructions, each directed to use for different medical applications or conditions. In some instances, an attached camera head may only have one set of configuration information and program instructions. In the case that there are multiple sets, a user may be given the option of selecting, for example, a profile for standard configuration information and program instructions settings, or a profile that performs better in low light conditions. Once the user selects the program instructions to be performed, system controller 304 accesses flash memory 317 to obtain the selected program instructions and then loads them into DSP 303 for execution as described with reference to the feature of multi-head adaptability. In addition, system controller 304 reconfigures reconfigurable controller 302 with the selected configuration information. As with the changes executed when a new head is detected, the selection of different configuration information and program instructions is not limited to changing simple numerical data points, such as gain factors and filter characteristics. Instead, a user selection of configuration information allows for significant hardware changes used for different applications, such as shifting the time base and digital zoom.
Another feature of the present invention is the ability to upgrade both the software (program instructions) and firmware (as configured by the configuration information) available to the camera control unit in the field. "In the field" refers to the time after the camera control unit has been delivered to an end user, such as a hospital. Program instructions 370 and configuration information 360 utilized by camera control unit 6 are stored in flash memory 317. One way in which flash memory 317 can be updated is through a PC download, in which PC 12 is connected to camera control unit 6 and new program instructions and configuration information are downloaded into flash memory 317. Alternatively, the software and firmware can be updated by way of quick-swap memory device 13.
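As one purely illustrative arrangement, each downloaded image could be preceded in flash by a small header carrying a version number and an integrity check, as in the C sketch below; the header fields and the crc32() and flash_write() helpers are assumptions, not part of the system described above.

    #include <stdint.h>
    #include <stddef.h>

    /* Hypothetical header preceding each downloaded image (program
     * instructions or configuration information) in flash memory. */
    struct image_header {
        uint32_t magic;         /* marks a valid image             */
        uint16_t camera_type;   /* head type the image supports    */
        uint16_t version;       /* compared during field upgrades  */
        uint32_t length;        /* payload length in bytes         */
        uint32_t crc;           /* integrity check after download  */
    };

    /* Stand-ins for platform services; signatures are assumptions. */
    extern uint32_t crc32(const void *data, size_t len);
    extern int flash_write(uint32_t offset, const void *data, size_t len);

    int store_image(uint32_t offset, const struct image_header *h,
                    const void *payload)
    {
        if (crc32(payload, h->length) != h->crc)
            return -1;                        /* reject corrupted download */
        if (flash_write(offset, h, sizeof *h) != 0)
            return -1;
        return flash_write(offset + (uint32_t)sizeof *h, payload, h->length);
    }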
In combination with the feature of a reconfigurable controller and DSP, by allowing both the software and the firmware to be upgraded in the field, the present invention's camera control unit is able to adapt to new camera heads, pixel preprocessing algorithms, and image processing software without a change of hardware.
The feature of field-programmable software and firmware also contemplates redistributing functionality between reconfigurable controller 302 and DSP 303. As discussed above, input and output processing, together with some preprocessing, is handled by reconfigurable controller 302, while image processing is handled by DSP 303. Upgrading the software and firmware in the field may include shifting some of the image processing operations typically handled by DSP 303 to reconfigurable controller 302. Likewise, input and output processing and/or preprocessing operations could be shifted so as to be handled by DSP 303.
The invention has been described above with respect to particular illustrative embodiments. It is understood that the invention is not limited to the above-described embodiments and that various changes and modifications may be made by those skilled in the relevant art without departing from the spirit and scope of the invention. Thus, the present embodiments of the invention should be considered in all respects as illustrative and not restrictive, the scope of the invention to be determined by any claims supported by this specification, accompanying drawings, and the claims' equivalents rather than the foregoing description.
This application is a divisional application claiming priority to and the benefit of U.S. patent application Ser. No. 10/942,210, filed Sep. 15, 2004, entitled “ENDOSCOPY DEVICE SUPPORTING MULTIPLE INPUT DEVICES,” the content of which is hereby incorporated by reference as if recited in full herein for all purposes.