This patent document relates to systems and techniques for ophthalmic imaging. More particularly, it relates to systems and methods for providing an electronically controlled fixation light that improves the precision of docking an ophthalmic imaging system to a patient's eye.
A variety of advanced imaging devices have been developed over the years for ophthalmic imaging, diagnostics and surgery. For some applications, these imaging devices perform best when their optical axis is aligned with the optical axis of the imaged eye. Once the eye is brought into a position aligned with the optical axis of the imaging device, some devices enhance the precision of the imaging by keeping the eye essentially immobilized in this aligned position with a patient interface of an eye-docking system. The alignment of the optical axes is typically achieved by orienting the eye so that its optical axis is parallel to that of the imaging system and then docking the patient interface on the eye in a concentric manner. Therefore, as the precision of the imaging devices improves, the demand for eye-docking systems which provide more precise alignment also increases.
Achieving good alignment can be challenging, however, as without feedback and guidance systems the patient module often ends up docking to the eye in an off-center position with the eye's optical axis tilted relative to that of the imaging system.
In some systems, the operator of the imaging device can improve the alignment by adjusting the imaging system, the patient's eye, or both during the docking process. The operator can guide the docking iteratively by instructing the patient verbally, manually orienting the eyeball, or adjusting portions of the imaging device, such as its objective or gantry. However, the inaccuracy of these approaches can make the docking process quite time consuming and frustrating.
In some systems, such as in some surgical systems using excimer lasers, the alignment is aided by a fixation light. The fixation light can be centered with the optical axis of the imaging system. The patient can be instructed to train his or her eye on the fixation light, thereby aligning the eye with the optical axis. However, even these fixation light systems have limitations.
This patent document discloses fixation light controller systems with improved functionalities. In some systems, the fixation light is simply centered with the optical axis of the imaging device. In such systems, in the typical case of the center of the imaged eye being off the optical axis of the imaging device, even if the patient looks at the fixation light, his or her eye will not be properly aligned with the optical axis of the device.
In some systems, including some YAG lasers and slit lamps, the fixation light is not fixed and thus can be manually adjusted. However, since the adjustment is purely mechanical, it typically lacks precision, and such mechanical adjustments can still be quite time consuming and frustrating. This lack of precision can hinder the performance of these devices, including ophthalmic surgical, imaging and diagnostic systems.
The present patent document discloses fixation light controller systems that offer solutions to the above-described problems. The disclosed examples and implementations can control a fixation light for an ophthalmic imaging system by non-mechanical control systems. For example, an ophthalmic system can include: an ophthalmic imaging device that generates an image of a portion of an imaged eye; a fixation light controller, including an input module configured to receive an input in relation to the image generated by the ophthalmic imaging device, and a control signal generator that generates a fixation light control signal in response to the received input; and a fixation light source configured to receive the fixation light control signal and to generate a fixation light according to the received fixation light control signal.
In some implementations, where the ophthalmic imaging device is configured to generate the image essentially optically, the ophthalmic imaging device can include a microscope, an ophthalmic microscope, or a stereo microscope. In some implementations, where the ophthalmic imaging device is configured to generate the image at least in part electronically, the ophthalmic imaging device can include an electronic sensing system that senses a collected imaging light from the imaged eye, including at least one of a Charge-Coupled Device (CCD) array, a Complementary Metal-Oxide Semiconductor (CMOS) array, a pixel-array, and an electronic sensor array. The ophthalmic imaging device can also include an electronic display system that displays the image of a portion of the imaged eye in relation to the sensed collected imaging light, including at least one of a Light Emitting Diode (LED) display, a plasma screen, an electronic display, a computer display, a Liquid Crystal Display (LCD) screen, a Cathode Ray Tube (CRT) display, a video-module, a video microscope display, a stereo video microscope display, a high definition (HD) video microscope, a processor-based image system, and an opto-mechanical projector. In some implementations, the ophthalmic imaging device can include an optical coherence tomographic (OCT) imaging system.
In some implementations, the ophthalmic imaging device can include an imaging module, configured to indicate a misalignment of the imaged eye and a reference-component of the ophthalmic imaging device. In some implementations, the reference-component of the imaging device can be an objective, a patient module, a docking tip, an interface, a contact lens, a pupil, a viewing frame, a reference frame, or an internal lens of the ophthalmic system. The imaging module can be configured to display a reference pattern related to the reference-component that can assist a system operator to estimate the misalignment of the imaged eye and the reference-component of the imaging device.
In some implementations, the ophthalmic imaging device can include an image-processor, configured to analyze the image of the portion of the imaged eye and the reference pattern and to determine the misalignment of the imaged eye and the reference-component of the imaging device, and the imaging module can be configured to display an indication of the misalignment determined by the image-processor.
In some implementations, the input module is configured to receive an electronic, mechanical, optical, or sensed input. The input module can include a touch-pad, a touch-screen, a joystick, an electro-mechanical sensor, a position sensor, an optical sensor, a voice-prompted actuator, or an electro-mechanical controller. In some implementations, the fixation light source can include at least one of an LED array, a plasma screen, an electronic display, a computer display, an LCD screen, a video-module, an opto-mechanical projector, a CRT display, a slit-lamp, a processor-based image system, and a light-source movable by an electro-mechanical actuator.
In some implementations, the fixation light source is configured to display the fixation light for a non-imaged eye of the patient, and to move the displayed fixation light according to the received fixation light control signal to assist a reduction of a misalignment between the imaged eye and a reference-component of the ophthalmic system. In some implementations, the fixation light source is configured to generate the fixation light for the imaged eye, and to adjust the generated fixation light according to the received fixation light control signal to assist a reduction of a misalignment between the imaged eye and a reference-component of the ophthalmic system.
In some implementations, a method of aligning an eye with an ophthalmic system can include providing an imaging device and an electronically adjustable fixation light system, positioning a component of the imaging device and an imaged eye of a patient for generating an image of a portion of the imaged eye, imaging a portion of the imaged eye, determining a misalignment of the imaged eye relative to the imaging device based on the image, and controlling a fixation light of the fixation light system with an electronic control signal in accordance with the determined misalignment.
In some implementations, the providing the imaging device can include providing a microscope, an ophthalmic microscope, a stereo microscope, a video microscope, a Light Emitting Diode (LED) display, a plasma screen, an electronic display, a computer display, a Liquid Crystal Display (LCD) screen, a Cathode Ray Tube (CRT) display, a video-module, a video microscope display, a stereo video microscope display, a high definition (HD) video microscope, a processor-based image system, or an opto-mechanical projector. In some implementations, the providing the imaging device can include providing an optical coherence tomographic (OCT) system.
In some implementations, the positioning the component of the imaging device can include positioning at least one of an objective, a patient module, a docking tip, a contact lens, a pupil, a viewing frame, a reference frame, and an internal lens of the ophthalmic system in a spatial relation with a structure of the imaged eye suitable for imaging. In some implementations, the determining the misalignment can include determining at least one of a lateral misalignment and a rotational misalignment.
In some implementations, the determining the misalignment can include determining the misalignment with a passive assistance of the imaging device, the imaging device displaying an image of a portion of the imaged eye and a reference pattern. In some implementations, the determining the misalignment can include determining the misalignment with an active assistance of the imaging device, the imaging device displaying an image of a portion of the imaged eye, a reference pattern and a misalignment indicator.
In some implementations, the controlling the fixation light can include generating the electronic control signal with a fixation light controller, wherein the fixation light controller can include a touch-pad, a touch-screen, a joystick, an electro-mechanical sensor, a position sensor, an optical sensor, a voice-prompted actuator, or an electro-mechanical controller. In some implementations, the generating the electronic control signal can include generating the electronic control signal to cause a fixation light source to generate the fixation light to guide the patient to reduce the determined misalignment.
In some implementations, the fixation light source can be an LED array, a plasma screen, an electronic display, a computer display, an LCD display, a CRT display, a video-module, a slit-lamp, a processor-based image system, or a light-source movable by an electro-mechanical actuator. In some implementations, the generating the electronic control signal can include generating the electronic control signal for at least one of the imaged eye and a non-imaged eye. In some implementations, the determining the misalignment and the controlling the fixation light can be repeated iteratively.
In some implementations, a method of aligning an eye with an ophthalmic system can include imaging a portion of a procedure eye of a patient by an ophthalmic imaging device, displaying the image of the procedure eye by an imaging module, displaying a reference pattern in relation to the displayed image to indicate a misalignment of the imaged eye and a reference-element of the ophthalmic system, receiving a fixation light control command by a fixation light controller, and displaying a fixation light by a fixation light source in response to the fixation light control command to assist the patient to reduce the misalignment.
In some implementations, the receiving the fixation light control command can include receiving the fixation light control command through at least one of a touch-pad, a touch-screen, a joystick, an electro-mechanical sensor, a position sensor, an optical sensor, a voice-prompted actuator, and an electro-mechanical controller. In some implementations, the displaying the fixation light can include displaying the fixation light by at least one of an LED array, a plasma screen, an electronic display, a computer display, an LCD screen, a video-module, an opto-mechanical projector, a slit-lamp, a processor-based image system, and a light-source movable by an electro-mechanical actuator. In some implementations, the displaying the fixation light can include displaying the fixation light for one of the procedure eye or the non-procedure eye.
Implementations and embodiments in this patent document provide a fixation light system for ophthalmic imaging devices for increasing the precision of the alignment of the imaged eye and the imaging device.
However, verbal instructions can be unclear to an already disoriented patient, and manipulating the eye can be cumbersome and imprecise. Also, the patient is likely to undo or resist the actions of the surgeon or technician.
Some ophthalmic systems can utilize a fixation light to provide guidance for the patient. However, fixation light devices still have shortcomings, as discussed above. Some devices provide adjustable fixation lights as an improvement. However, even in such systems, the location of the fixation light is typically adjusted manually, still resulting in an adjustment process with limited precision.
In some implementations, the ophthalmic imaging device 110 can generate the image essentially optically. For example, the ophthalmic imaging device 110 can include a microscope, an ophthalmic microscope, or a stereo microscope. The imaging interface of such a microscope can be its eyepiece.
In some implementations, the ophthalmic imaging device 110 can generate the image at least in part electronically. For example, the ophthalmic imaging device 110 can include an electronic sensing system that senses the collected imaging light 113. The electronic sensing system can include a Charge-Coupled Device (CCD)-array, a Complementary Metal Oxide Semiconductor (CMOS) array, a pixel-array, or an electronic sensor array to sense the collected imaging light 113.
In these electronic imaging systems the imaging module 115 can include an electronic display system as an imaging interface. This electronic display can display an electronic image of a portion of the imaged eye 1i based on the sensed light 113. This electronic display or imaging interface can be, for example, a Light Emitting Diode (LED) display, a plasma screen, an electronic display, a computer display, a Liquid Crystal Display (LCD) screen, a Cathode Ray Tube (CRT) display, a video-module, a video microscope display, a stereo video microscope display, a High Definition (HD) video microscope, a processor-based image system, an opto-mechanical projector, or a light-source movable by an electro-mechanical actuator. In some implementations, the elements of the optical and the electronic imaging systems can be combined.
In some implementations, the ophthalmic imaging device can include an optical coherence tomographic (OCT) imaging system, as described in relation to
The reference-component of the imaging device 110 can be an objective, a patient module, a docking tip, an interface, a contact lens, a pupil, a viewing frame, a reference frame, an internal lens of the ophthalmic system, or any equivalents.
The location or display of the targeting pattern 117 can be fixed to the reference-component, in effect indicating the position of the reference-component. Therefore, the simultaneous display of the image portion of the imaged eye 1i and the targeting pattern 117 by the imaging module 115 can effectively assist the determination of the misalignment of the imaged eye 1i.
This assistance can be passive: the imaging module 115 merely displays the image portion of the imaged eye 1i and the reference pattern 117, so that a system operator can determine the degree of misalignment between the imaged eye 1i and the reference-component of the ophthalmic system 100.
In some implementations, such as in electronic imaging modules 115, the imaging module 115 can actively assist the determination of the misalignment of the imaged eye 1i and the reference-component of the ophthalmic imaging system 100. Such active embodiments can include an image-processor that analyzes the image portion of the imaged eye 1i and the targeting pattern 117 and computes the misalignment. The imaging module 115 can then display an indication of the computed misalignment, e.g. in the form of an arrow 233 (as shown in
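The active assistance described above reduces to comparing the detected center of the imaged eye with the center of the reference pattern. A minimal sketch of such a computation, assuming the image-processor has already located both centers in image-plane millimeters (the function names and the coordinate convention are illustrative assumptions, not from this document):

```python
import math

def compute_misalignment(eye_center, reference_center):
    """Return the misalignment vector (dx, dy) between the detected
    center of the imaged eye and the center of the reference pattern,
    together with its magnitude, in image-plane millimeters."""
    dx = eye_center[0] - reference_center[0]
    dy = eye_center[1] - reference_center[1]
    return (dx, dy), math.hypot(dx, dy)

def misalignment_arrow(vector):
    """Describe a correction arrow an imaging module could overlay:
    it points opposite to the misalignment vector, i.e. toward the
    reference center."""
    dx, dy = vector
    return {"angle_deg": math.degrees(math.atan2(-dy, -dx)),
            "length_mm": math.hypot(dx, dy)}

# Example: eye center 2 mm right of and 1 mm off the reference center.
vector, magnitude = compute_misalignment((2.0, 1.0), (0.0, 0.0))
arrow = misalignment_arrow(vector)  # arrow points back toward center
```

The resulting vector could drive an overlay such as the arrow 233, with the arrow drawn opposite to the measured offset so that it indicates the direction in which the alignment should be corrected.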
In addition to the ophthalmic imaging device 110, the ophthalmic imaging system 100 can include the electronically controlled fixation light system 120. This electronically controlled fixation light system 120 can include a fixation light controller 130 and a fixation light source 140.
In response to the determined misalignment, the operator of the imaging system 100 can generate an input or command for the fixation light system 120 through the input module 135 of the fixation light controller 130. This input can represent a command regarding how the imaged eye 1i should be moved to reduce the misalignment, in a manner described below. In an example, if, from the image of the imaging module 115, the operator determined that the center of the imaged eye is 2 millimeters to the right of the center of the objective 112, then the operator can input a command through the input module 135 that will cause the patient to move the imaged eye 2 millimeters to the left to achieve an improved alignment.
The input module 135 can be an electronic, mechanical, optical, or sensed input module. For example, the input module 135 can be a touch-pad, a touch-screen, a joystick, an electro-mechanical sensor, a position sensor, an optical sensor, a voice-prompted actuator, or an electro-mechanical controller.
Once the command has been entered into the input module 135, a control signal generator of the input module 135 can generate a fixation light control signal in response to the received command. A large variety of well-known electronic signal generators can be utilized for this function.
The fixation light source 140 can include an LED array, a plasma screen, an electronic display, a computer display, an LCD screen, a video-module, an opto-mechanical projector, a slit-lamp, a processor-based image system, or a light-source movable by an electro-mechanical actuator.
Other embodiments may simply display the fixation light 145 on the fixation light source 140 at a location according to the fixation light control signal, instead of moving it. In either of these embodiments, the patient can be instructed to follow the fixation light 145 with the control eye 1c.
To facilitate procedures on both eyes, some embodiments may include two fixation light sources 140, one on each side of the objective 112.
The providing the imaging device 210a can include providing a microscope, an ophthalmic microscope, a stereo microscope, a video microscope, a Light Emitting Diode (LED) display, a plasma screen, an electronic display, a computer display, a Liquid Crystal Display (LCD) screen, a Cathode Ray Tube (CRT) display, a video-module, a video microscope display, a stereo video microscope display, a high definition (HD) video microscope, a processor-based image system, an opto-mechanical projector, or an optical coherence tomographic (OCT) system. In some of these imaging devices 110, the objective 112 can capture the collected imaging light 113 returned by the imaged eye 1i. The optic 114 can guide the collected imaging light 113 to the imaging module 115 and display it, e.g. by the imaging interface of the imaging module 115.
The providing the electronically adjustable fixation light system 210b can include providing the fixation light controller 130 and the fixation light source 140.
The positioning 220 can include positioning at least one of the objective 112, the patient module, the docking tip, the contact lens, the pupil, the viewing frame, the reference frame, or an internal lens of the ophthalmic system to line up with a structure of the imaged eye 1i. The positioning 220 can also include moving the imaged eye 1i to a position suitable for imaging the imaged eye 1i. The positioning can also include moving both the objective 112 of the ophthalmic imaging device 110 and the imaged eye 1i to positions suitable for imaging the imaged eye 1i.
In some implementations, after the positioning 220 the imaged eye 1i and the imaging device 110 can be close but not yet in physical contact. In others, there can be a partial physical contact that still allows for a movement of the imaged eye 1i by either the patient or the surgeon.
The imaging a portion of the imaged eye 230 can include the surgeon imaging a portion of the imaged eye 1i with at least one of a microscope, an ophthalmic stereo microscope, a video microscope, a stereo video microscope, a high definition (HD) video microscope, or an optical coherence tomographic (OCT) system.
The determining the misalignment 240 can be performed by the operator of the ophthalmic imaging system 100, such as a surgeon. In such implementations, the imaging device 110 can assist the determining 240 passively by displaying an imaged portion of the imaged eye 1i and the reference or targeting pattern 117 simultaneously by the imaging interface of the imaging module 115.
In some implementations, the imaging device 110 can assist the determining 240 actively by displaying the imaged portion of the imaged eye 1i, the reference or targeting pattern 117, and a computed misalignment indicator 233 by the imaging interface of the imaging module 115.
The controlling the fixation light 250 can include generating an electronic control signal according to the determined misalignment. In some implementations, the electronic control signal can be generated by operating at least one of a touch-pad, a touch-screen, a joystick, an electro-mechanical sensor, a position sensor, an optical sensor, a voice-prompted actuator, or an electro-mechanical controller.
The controlling the fixation light 250 can also include generating the electronic control signal to cause the fixation light source 140 to display the fixation light 145 to guide the patient to reduce the misalignment between the imaged eye 1i and the ophthalmic imaging device 110.
In response, the surgeon can decide that the fixation light 145 should be adjusted or moved to the lower-right direction by the fixation light source 140 to guide the patient to reduce and compensate this misalignment. Correspondingly, the surgeon can create a fixation light control command or input to represent the compensating adjustment of the fixation light 145. In this example, the surgeon can move his finger 9 on a touchpad 135 of the fixation light controller 130 in the lower-right direction. The input of this fixation light control command can lead to the generation of an electronic control signal by the fixation light controller 130 that causes the fixation light source 140 to move the fixation light 145 in the lower-right direction on an LCD screen. In other embodiments, other types of movement of the surgeon's finger can represent the necessary compensating adjustment, such as a movement in the upper-left direction.
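One way to realize the touchpad mapping described above is to scale the surgeon's finger displacement into a fixation light displacement on the display. A sketch under assumed conventions (the gain value and names are hypothetical; as the last sentence above notes, a real controller might also invert an axis):

```python
def gesture_to_control_signal(finger_delta_mm, gain_px_per_mm=10.0):
    """Map the surgeon's finger displacement on the touchpad 135 to a
    fixation light displacement on the LCD, scaled by a fixed gain."""
    dx, dy = finger_delta_mm
    return (dx * gain_px_per_mm, dy * gain_px_per_mm)

# A lower-right stroke (positive x right, positive y down) moves the
# fixation light 145 toward the lower-right of the screen.
signal = gesture_to_control_signal((1.5, 2.0))  # → (15.0, 20.0)
```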
In other examples, a possibly disposable patient interface 112-3 can be attached to the objective 112. The patient interface 112-3 can include a contact lens or applanation plate 112-4 and a vacuum skirt or suction seal 112-5. In these embodiments, the above system 100 and method 200 can be used for aligning either the contact lens 112-4 or the distal lens 112-2 with the imaged eye 1i.
A lateral misalignment can be compensated by the patient following the adjusted fixation light 145 to move the imaged eye 1i laterally by Δ, or in general by the misalignment vector (Δx,Δy). In other implementations, the lateral misalignment can be also compensated by the surgeon moving the objective 112 with a lateral adjustment Δ′, or in general by (Δ′x,Δ′y). In some cases, both the imaged eye 1i and the objective 112 can be adjusted to compensate the lateral misalignment together.
In yet other embodiments, a rotational misalignment can be reduced by the patient following the adjusted fixation light 145 causing the imaged eye to rotate by an angle α, or in general by the Euler angles (θ,φ).
Finally, in some cases both lateral and rotational misalignment can be present between the imaged eye 1i and the ophthalmic system 100. In such cases the surgeon may guide the compensation of the rotational misalignment by adjusting the fixation light 145 and by instructing the patient to follow the fixation light, while laterally moving the objective 112 to compensate the lateral misalignment.
As the first fixation light control command will often reduce the misalignment but not eliminate it, after the patient has reacted to the adjusted fixation light 145 the surgeon can repeat the determining a residual misalignment 240 and the controlling the fixation light with the control signal 250 to reduce the misalignment further, iteratively. This iteration can be continued until the misalignment has been compensated with a desired precision.
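The iteration of acts 240 and 250 can be pictured as a simple closed loop that repeats until the residual misalignment falls below the desired precision. A sketch, assuming callables for the measurement and the fixation light adjustment (the names, tolerance, and iteration budget are illustrative, not specified by this document):

```python
def iterative_alignment(measure_misalignment, adjust_fixation_light,
                        tolerance_mm=0.1, max_iterations=10):
    """Repeat the measure/adjust cycle until the residual misalignment
    is within tolerance. measure_misalignment() returns the current
    (dx, dy) offset in mm; adjust_fixation_light(offset) commands the
    compensating move of the fixation light."""
    for _ in range(max_iterations):
        dx, dy = measure_misalignment()
        if (dx * dx + dy * dy) ** 0.5 <= tolerance_mm:
            return True   # aligned within the desired precision
        adjust_fixation_light((dx, dy))
    return False  # precision not reached within the iteration budget

# Simulated docking: each adjustment removes about 80% of the offset,
# mirroring the partial corrections described above, so the loop
# converges in a few iterations.
state = [2.0, 1.0]  # initial (x, y) misalignment in mm

def measure():
    return (state[0], state[1])

def adjust(offset):
    state[0] -= 0.8 * offset[0]
    state[1] -= 0.8 * offset[1]

aligned = iterative_alignment(measure, adjust)  # → True
```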
As before, the fixation light source 140 can include an LED array, a plasma screen, an electronic display, a computer display, an LCD screen, a video-module, a slit-lamp, a processor-based image system, or a light-source movable by an electro-mechanical actuator.
The method 300 of aligning the imaged eye 1i eye with the ophthalmic system 100 can include imaging a portion of a procedure eye of a patient by an ophthalmic imaging device—310; displaying the image of the procedure eye by an imaging module—320; displaying a reference pattern in relation to the displayed image to indicate a misalignment of the imaged eye and a reference-element of the ophthalmic system—330; receiving a fixation light control command by a fixation light controller—340; and displaying a fixation light by a fixation light source in response to the fixation light control command to assist the patient to reduce the misalignment—350.
The acts 310-330 have been described earlier in detail from the viewpoint of the operator of the ophthalmic system 100, such as the surgeon. The receiving the fixation light control command 340 can include receiving the fixation light control command through at least one of a touch-pad, a touch-screen, a joystick, an electro-mechanical sensor, a position sensor, an optical sensor, a voice-prompted actuator, or an electro-mechanical controller.
The displaying the fixation light 350 can include displaying the fixation light by at least one of an LED array, a plasma screen, an electronic display, a computer display, an LCD screen, a video-module, an opto-mechanical projector, a slit-lamp, a processor-based image system, or a light-source movable by an electro-mechanical actuator.
The displaying the fixation light 350 can include displaying the fixation light for one of the procedure eye or the non-procedure eye.
In addition, the elements 110-145′ can have functionalities related to the feature that in this implementation of the imaging system 100 the fixation light 145′ is not displayed via a separate fixation light display or source 140 for the control eye 1c. Instead, a fixation light controller 130′ can apply an electronic fixation light control signal to a fixation light source 140′ that projects a projected fixation light 145′ into the optical pathway of the imaging device 110. As such, the imaging device 110 and the fixation light system 120′ share some elements, as shown by the dotted lines. In some implementations, the projected fixation light 145′ can be coupled into the optic 114 that contains additional adjustable mirrors to adjust the optical path of the projected fixation light 145′. This coupling can take place between the optic 114 and the imaging module 115, or somewhere along the optic 114 e.g. by a beam splitter BS, as shown. In other embodiments, the projected fixation light 145′ can have a separate optical train or pathway to adjust its path and can be coupled into the optical pathway of the imaging device 110 just before the objective-projector 112′.
In addition, the elements 110″-145 can have functionalities related to the feature that the ophthalmic system 100″ can include a secondary imaging device 150. The secondary imaging device 150 can be, for example, an optical coherence tomographic (OCT) system. Numerous OCT imaging systems are known, including time-domain OCT systems and frequency domain OCT systems with a spectrometer or a swept source. A wide variety of these OCT systems can be used in the ophthalmic system 100″ to achieve various advantages. The imaging beam for the secondary imaging device 150 can be coupled into the main optical pathway via a beam splitter BS1.
Some implementations of the ophthalmic system 100″ can also include a procedure laser 160 for various ophthalmic surgical procedures. Further, some embodiments can include a patient interface 170 to provide a firmer connection between the imaged eye 1i and the ophthalmic imaging device 110, for example with the application of vacuum suction. This patient interface 170 can be analogous to the patient interface 112-3 in
In some implementations of the ophthalmic system 100″ the imaging can be performed by the imaging module 115, in which case the system 100″ and its operation can be largely analogous to the earlier described embodiments.
In other implementations though, the secondary/OCT imaging system 150 can be used to image the imaged eye 1i. OCT imaging can be particularly useful for imaging structures of the eye that are not visible to an ophthalmic microscope. An example is imaging the lens 5 of the eye. Because of its soft supporting system, the lens 5 is often not concentric with the visible structures of the eye such as the pupil 4. Further, as the weight of the objective 112 presses on the eye through the interface 170, the lens 5 can be additionally displaced and tilted. At the same time, aligning the ophthalmic system 100″ with the lens 5 instead of the pupil 4 or the limbus can be particularly important during cataract surgeries, where the quality of the capsulotomy and other procedures can be improved by such an alignment.
While this specification contains many specifics, these should not be construed as limitations on the scope of the invention or of what can be claimed, but rather as descriptions of features specific to particular embodiments. Certain features that are described in this specification in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination. Moreover, although features can be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination can be directed to a subcombination or variation of a subcombination.
6863667 | Webb et al. | Mar 2005 | B2 |
6887232 | Bille | May 2005 | B2 |
6899707 | Scholler et al. | May 2005 | B2 |
6932807 | Tomita et al. | Aug 2005 | B1 |
6991629 | Juhasz et al. | Jan 2006 | B1 |
6996905 | Meguro | Feb 2006 | B2 |
7006232 | Rollins et al. | Feb 2006 | B2 |
7018376 | Webb et al. | Mar 2006 | B2 |
7027233 | Goldstein et al. | Apr 2006 | B2 |
7044602 | Chernyak | May 2006 | B2 |
7061622 | Rollins et al. | Jun 2006 | B2 |
7072047 | Westphal et al. | Jul 2006 | B2 |
7079254 | Kane et al. | Jul 2006 | B2 |
7102756 | Izatt et al. | Sep 2006 | B2 |
7113818 | Podoleanu et al. | Sep 2006 | B2 |
7126693 | Everett et al. | Oct 2006 | B2 |
7130054 | Ostrovsky et al. | Oct 2006 | B2 |
7133137 | Shimmick | Nov 2006 | B2 |
7139077 | Podoleanu et al. | Nov 2006 | B2 |
7145661 | Hitzenberger | Dec 2006 | B2 |
7148970 | de Boer | Dec 2006 | B2 |
7184148 | Alphonse | Feb 2007 | B2 |
7207983 | Hahn et al. | Apr 2007 | B2 |
7248371 | Chan et al. | Jul 2007 | B2 |
7268885 | Chan et al. | Sep 2007 | B2 |
7280221 | Wei | Oct 2007 | B2 |
7307733 | Chan et al. | Dec 2007 | B2 |
7310150 | Guillermo et al. | Dec 2007 | B2 |
7312876 | Chan et al. | Dec 2007 | B2 |
7319566 | Prince et al. | Jan 2008 | B2 |
7329002 | Nakanishi | Feb 2008 | B2 |
7330270 | O'Hara et al. | Feb 2008 | B2 |
7330273 | Podoleanu et al. | Feb 2008 | B2 |
7335223 | Obrebski | Feb 2008 | B2 |
7336366 | Choma et al. | Feb 2008 | B2 |
7342659 | Horn et al. | Mar 2008 | B2 |
7347548 | Huang et al. | Mar 2008 | B2 |
7352444 | Seams et al. | Apr 2008 | B1 |
7355716 | de Boer et al. | Apr 2008 | B2 |
7364296 | Miller et al. | Apr 2008 | B2 |
7365856 | Everett et al. | Apr 2008 | B2 |
7365859 | Yun et al. | Apr 2008 | B2 |
7370966 | Fukuma et al. | May 2008 | B2 |
7371230 | Webb et al. | May 2008 | B2 |
7372578 | Akiba et al. | May 2008 | B2 |
7377642 | Ishihara et al. | May 2008 | B2 |
7388672 | Zhou et al. | Jun 2008 | B2 |
7390089 | Loesel et al. | Jun 2008 | B2 |
7400410 | Baker et al. | Jul 2008 | B2 |
7402159 | Loesel et al. | Jul 2008 | B2 |
7426037 | Ostrovsky et al. | Sep 2008 | B2 |
7433046 | Everett et al. | Oct 2008 | B2 |
7452077 | Meyer et al. | Nov 2008 | B2 |
7452080 | Wiltberger et al. | Nov 2008 | B2 |
7461658 | Jones et al. | Dec 2008 | B2 |
7466423 | Podoleanu et al. | Dec 2008 | B2 |
7470025 | Iwanaga | Dec 2008 | B2 |
7477764 | Haisch | Jan 2009 | B2 |
7480058 | Zhao et al. | Jan 2009 | B2 |
7480059 | Zhou et al. | Jan 2009 | B2 |
7488070 | Hauger et al. | Feb 2009 | B2 |
7488930 | Ajgaonkar et al. | Feb 2009 | B2 |
7492466 | Chan et al. | Feb 2009 | B2 |
7503916 | Shimmick | Mar 2009 | B2 |
7508525 | Zhou et al. | Mar 2009 | B2 |
7535577 | Podoleanu et al. | May 2009 | B2 |
7537591 | Feige et al. | May 2009 | B2 |
7557928 | Ueno | Jul 2009 | B2 |
7575322 | Somani | Aug 2009 | B2 |
7593559 | Toth et al. | Sep 2009 | B2 |
7602500 | Izatt et al. | Oct 2009 | B2 |
7604351 | Fukuma et al. | Oct 2009 | B2 |
7614744 | Abe | Nov 2009 | B2 |
7630083 | de Boer et al. | Dec 2009 | B2 |
7631970 | Wei | Dec 2009 | B2 |
7633627 | Choma et al. | Dec 2009 | B2 |
7643152 | de Boer et al. | Jan 2010 | B2 |
7797119 | De Boer et al. | Sep 2010 | B2 |
7813644 | Chen et al. | Oct 2010 | B2 |
7898712 | Adams et al. | Mar 2011 | B2 |
8223143 | Dastmalchi et al. | Jul 2012 | B2 |
8394084 | Palankar et al. | Mar 2013 | B2 |
20010022648 | Lai | Sep 2001 | A1 |
20020013574 | Elbrecht et al. | Jan 2002 | A1 |
20020082466 | Han | Jun 2002 | A1 |
20020097374 | Payne et al. | Jul 2002 | A1 |
20020133145 | Gerlach et al. | Sep 2002 | A1 |
20020198516 | Knopp | Dec 2002 | A1 |
20030090674 | Zeylikovich et al. | May 2003 | A1 |
20030206272 | Cornsweet et al. | Nov 2003 | A1 |
20040039378 | Lin | Feb 2004 | A1 |
20040059321 | Knopp et al. | Mar 2004 | A1 |
20040151466 | Crossman-Bosworth et al. | Aug 2004 | A1 |
20040243233 | Phillips | Dec 2004 | A1 |
20050010109 | Faul | Jan 2005 | A1 |
20050015120 | Seibel et al. | Jan 2005 | A1 |
20050021011 | LaHaye | Jan 2005 | A1 |
20050173817 | Fauver et al. | Aug 2005 | A1 |
20050192562 | Loesel et al. | Sep 2005 | A1 |
20050201633 | Moon et al. | Sep 2005 | A1 |
20050203492 | Nguyen et al. | Sep 2005 | A1 |
20050215986 | Chernyak et al. | Sep 2005 | A1 |
20050284774 | Mordaunt | Dec 2005 | A1 |
20050286019 | Wiltberger et al. | Dec 2005 | A1 |
20050288745 | Andersen et al. | Dec 2005 | A1 |
20060020172 | Luerssen et al. | Jan 2006 | A1 |
20060077346 | Matsumoto | Apr 2006 | A1 |
20060100613 | McArdle et al. | May 2006 | A1 |
20060179992 | Kermani | Aug 2006 | A1 |
20060187462 | Srinivasan et al. | Aug 2006 | A1 |
20060195076 | Blumenkranz et al. | Aug 2006 | A1 |
20060206102 | Shimmick | Sep 2006 | A1 |
20070013867 | Ichikawa | Jan 2007 | A1 |
20070121069 | Andersen et al. | May 2007 | A1 |
20070126985 | Wiltberger et al. | Jun 2007 | A1 |
20070129709 | Andersen et al. | Jun 2007 | A1 |
20070129775 | Mordaunt et al. | Jun 2007 | A1 |
20070147730 | Wiltberger et al. | Jun 2007 | A1 |
20070173791 | Raksi | Jul 2007 | A1 |
20070173794 | Frey et al. | Jul 2007 | A1 |
20070173795 | Frey et al. | Jul 2007 | A1 |
20070185475 | Frey et al. | Aug 2007 | A1 |
20070189664 | Andersen et al. | Aug 2007 | A1 |
20070216909 | Everett et al. | Sep 2007 | A1 |
20070219541 | Kurtz | Sep 2007 | A1 |
20070230520 | Mordaunt et al. | Oct 2007 | A1 |
20070282313 | Huang et al. | Dec 2007 | A1 |
20070291277 | Everett et al. | Dec 2007 | A1 |
20070299429 | Amano | Dec 2007 | A1 |
20080033406 | Andersen et al. | Feb 2008 | A1 |
20080049188 | Wiltberger et al. | Feb 2008 | A1 |
20080055543 | Meyer et al. | Mar 2008 | A1 |
20080056610 | Kanda | Mar 2008 | A1 |
20080071254 | Lummis et al. | Mar 2008 | A1 |
20080088795 | Goldstein et al. | Apr 2008 | A1 |
20080100612 | Dastmalchi et al. | May 2008 | A1 |
20080281303 | Culbertson et al. | Nov 2008 | A1 |
20080281413 | Culbertson et al. | Nov 2008 | A1 |
20080319427 | Palanker | Dec 2008 | A1 |
20090012507 | Culbertson et al. | Jan 2009 | A1 |
20090088734 | Mordaunt | Apr 2009 | A1 |
20090125005 | Chernyak et al. | May 2009 | A1 |
20090131921 | Kurtz et al. | May 2009 | A1 |
20090149742 | Kato et al. | Jun 2009 | A1 |
20090157062 | Hauger et al. | Jun 2009 | A1 |
20090161827 | Gertner et al. | Jun 2009 | A1 |
20090163898 | Gertner et al. | Jun 2009 | A1 |
20090168017 | O'Hara et al. | Jul 2009 | A1 |
20090268161 | Hart et al. | Oct 2009 | A1 |
20100004641 | Frey et al. | Jan 2010 | A1 |
20100004643 | Frey et al. | Jan 2010 | A1 |
20100007848 | Murata | Jan 2010 | A1 |
20100022994 | Frey et al. | Jan 2010 | A1 |
20100022995 | Frey et al. | Jan 2010 | A1 |
20100022996 | Frey et al. | Jan 2010 | A1 |
20100042079 | Frey et al. | Feb 2010 | A1 |
20100110377 | Maloca et al. | May 2010 | A1 |
20100324543 | Kurtz et al. | Dec 2010 | A1 |
20110022036 | Frey et al. | Jan 2011 | A1 |
20110118609 | Goldshleger et al. | May 2011 | A1 |
20110304819 | Juhasz et al. | Dec 2011 | A1 |
20110319873 | Raksi et al. | Dec 2011 | A1 |
20120274903 | Sayeram et al. | Nov 2012 | A1 |
Number | Date | Country |
---|---|---|
1444946 | Aug 2004 | EP |
1803390 | Jul 2007 | EP |
1972266 | Sep 2008 | EP |
2002345758 | Dec 2002 | JP |
2009-112431 | May 2009 | JP |
9808048 | Feb 1998 | WO |
03062802 | Jul 2003 | WO |
2006074469 | Jul 2006 | WO |
2007106326 | Sep 2007 | WO |
2007130411 | Nov 2007 | WO |
Entry |
---|
Carl Zeiss Meditec, Inc. Cirrus HD-OCT User Manual. Dublin, CA, USA: Carl Zeiss Meditec, 2009. Print. |
Nidek Co. Ltd. Model Tonoref II Operator's Manual. Tokyo, Japan: Nidek, 2007. Print. |
Kim, Tae Hoon, Authorized Officer, Korean Intellectual Property Office, PCT International Application No. PCT/US2011/025332, in International Search Report, mailed Sep. 16, 2011, 8 pages. |
PCT International Search Report and Written Opinion dated Feb. 9, 2012 for International Application Serial No. PCT/US2011/040223. |
Kamensky et al.; “In situ monitoring of the middle IR laser ablation of a cataract-suffered human lens by optical coherent tomography”; Proc. SPIE; 2930: 222-229 (1996). |
Kamensky et al.; “Monitoring and animation of laser ablation process in cataracted eye lens using coherence tomography”; Proc. SPIE; 2981: 94-102 (1997). |
PCT International Search Report for International Application Serial No. PCT/US2011/023710 mailed Aug. 24, 2011. |
PCT International Search Report for International Application Serial No. PCT/US2010/056701 mailed Aug. 24, 2011. |
Swanson et al.; “In vivo retinal imaging by optical coherence tomography”; Optics Letters; vol. 18; No. 21; pp. 1864-1866 (Nov. 1993). |
RE 90/006,816, filed Feb. 27, 2007, Swanson et al. |
Arimoto et al., “Imaging Properties of Axicon in a Scanning Optical System,” Nov. 1, 1992, Applied Optics, 31(31): 6652-6657, 5 pages. |
Birngruber et al., “In-Vivo Imaging of the Development of Linear and Non-Linear Retinal Laser Effects Using Optical Coherence Tomography in Correlation with Histopathological Findings,” 1995, Proc. SPIE 2391:21-27, 7 pages. |
Chinn, S.R., et al., “Optical coherence tomography using a frequency-tunable optical source,” Optics Letters, 22(5):340-342, Mar. 1997. |
Fercher et al., “Eye-Length Measurement by Interferometry With Partially Coherent Light,” Mar. 1988, Optics Letters, 13(3):186-188, 3 pages. |
Fercher et al., “Measurement of Intraocular Distances by Backscattering Spectral Interferometry,” May 15, 1995, Optics Comm. 117:43-48, 6 pages. |
Huber, R., et al., “Three-dimensional and C-mode OCT imaging with a compact, frequency swept laser source at 1300 nm,” Optics Express, 13(26):10523-10538, Dec. 2005. |
International Search Report and Written Opinion dated Mar. 12, 2009 for International Application No. PCT/US2008/075511, filed Sep. 5, 2008 (9 pages). |
Izatt et al., Micron-Resolution Biomedical Imaging With Optical Coherence Tomography, Oct. 1993, Optics & Photonics News, pp. 14-19, 6 pages. |
Massow, O., et al., “Femtosecond laser microsurgery system controlled by optical coherence tomography,” Proceedings of the SPIE—Commercial and Biomedical Applications of Ultrafast Lasers VIII, vol. 6881, pp. 688106(1)-688106(6), Mar. 2008, 6 pages. |
Massow, O., et al., “Optical coherence tomography controlled femtosecond laser microsurgery system,” Proceedings of the SPIE—Optical Coherence Tomography and Coherence Techniques III, vol. 6627, pp. 662717(1)-662717(6), Aug. 2007. |
Ohmi, M., et al., “In-situ Observation of Tissue Laser Ablation Using Optical Coherence Tomography,” Optical and Quantum Electronics, 37(13-15):1175-1183, Dec. 2005, 9 pages. |
Sarunic, M., et al., “Imaging the Ocular Anterior Segment With Real-Time, Full-Range Fourier-Domain Optical Coherence Tomography,” Archives of Ophthalmology, 126(4):537-542, Apr. 2008, 6 pages. |
Sarunic, M., et al., “Instantaneous complex conjugate resolved spectral domain and swept-source OCT using 3×3 fiber couplers,” Optics Express, 13(3):957-967, Feb. 2005 11 pages. |
Stern et al., “Femtosecond Optical Ranging of Corneal Incision Depth,” Jan. 1989, Investigative Ophthalmology & Visual Science, 30(1):99-104, 6 pages. |
Swanson, et al., “Method and Apparatus for Optical Imaging with Means for Controlling the Longitudinal Range of the Sample,” U.S. Re-exam Patent Application No. 90/006,816, filed Oct. 20, 2003. |
Tao, Y., et al., “High-speed complex conjugate resolved retinal spectral domain optical coherence tomography using sinusoidal phase modulation,” Optics Letters, 32(20):2918-2920, Oct. 2007, 3 pages. |
European Patent Office, European Patent Application No. 10191057.8, in European Search Report, mailed Mar. 16, 2011, 3 pages. |
Wojtkowski et al., “In Vivo Human Retinal Imaging by Fourier Domain Optical Coherence Tomography,” Jul. 2002, Journal of Biomedical Optics 7(3):457-463, 7 pages. |
Yun, S.H., et al., “Wavelength-swept fiber laser with frequency shifted feedback and resonantly swept intra-cavity acoustooptic tunable filter,” IEEE Journal of Selected Topics in Quantum Electronics, 3(4):1087-1096, Aug. 1997. |
Korean Intellectual Property Office, PCT International Application No. PCT/US2011/023710, in International Search Report, mailed Aug. 24, 2011, 8 pages. |
Hee, M., et al., “Femtosecond Transillumination Optical Coherence Tomography,” Optics Letters, Jun. 1993, pp. 950-952, 18(12). |
PCT International Search Report and Written Opinion dated Apr. 10, 2012 for International Application No. PCT/US2011/051466 filed Sep. 13, 2011. |
Ostaszewski et al., “Risley prism Beam Pointer”, Proc. of SPIE, vol. 6304, 630406-1 through 630406-10, 2006 [10 pages]. |
Sarunic, M., et al., “Real-time quadrature projection complex conjugate resolved Fourier domain optical coherence tomography”, Optics Letters, 31(16):2426-2428, Aug. 2006. |
Bagayev et al., “Optical coherence tomography for in situ monitoring of laser corneal ablation”, Journal of Biomedical Optics, 7(4), pp. 633-642 (Oct. 2002). |
Blaha et al., “The slit lamp and the laser in ophthalmology—a new laser slit lamp”, SPIE Optical Instrumentation for Biomedical Laser Applications, vol. 658, pp. 38-42, 1986. |
Boppart, S., et al., “Intraoperative Assessment of Microsurgery with Three-dimensional Optical Coherence Tomography”, Radiology, 208(1):81-86, Jul. 1998. |
Davidson, “Analytic Waveguide Solutions and the Coherence Probe Microscope”, Microelectronic Engineering, 13, pp. 523-526, 1991. |
Drexler, W., et al., “Measurement of the thickness of fundus layers by partial coherence tomography”, Optical Engineering, 34(3):701-710, Mar. 1995. |
Dyer, P., et al., “Optical Fibre Delivery and Tissue Ablation Studies using a Pulsed Hydrogen Fluoride Laser”, Lasers in Medical Science, 7:331-340, 1992. |
Fercher et al., “In Vivo Optical Coherence Tomography”, American Journal of Ophthalmology, 116(1), pp. 113-114, 1993. |
Fujimoto, J., et al., “Biomedical Imaging using Optical Coherent Tomography”, 1994, 67. |
Hammer, D., “Ultrashort pulse laser induced bubble creation thresholds in ocular media”, SPIE, 2391:30-40, 1995. |
Hauger, C., et al., “High speed low coherence interferometer for optical coherence tomography”, Proceedings of SPIE, 4619:1-9, 2002. |
Hee, M., et al., “Optical Coherence tomography of the Human Retina”, Arch Ophthalmol, 113:325-332; Mar. 1995. |
Hitzenberger et al., “Interferometric Measurement of Corneal Thickness With Micrometer Precision”, American Journal of Ophthalmology, 118:468-476, Oct. 1994. |
Hitzenberger, C., et al., “Retinal layers located with a precision of 5 μm by partial coherence interferometry”, SPIE, 2393:176-181, 1995. |
Itoh et al., “Absolute measurements of 3-D shape using white-light interferometer”, SPIE Interferometry: Techniques and Analysis, 1755:24-28, 1992. |
Izatt et al., “Ophthalmic Diagnostics using Optical Coherence Tomography”, SPIE Ophthalmic Technologies, 1877:136-144, 1993. |
Izatt, J., et al., “Micrometer-Scale Resolution Imaging of the Anterior Eye In vivo With Optical Coherence Tomography”, Arch Ophthalmol, 112:1584-1589, Dec. 1994. |
Jean, B., et al., “Topography assisted photoablation”, SPIE, vol. 3591:202-208, 1999. |
Kamensky, V., et al., “In Situ Monitoring of Laser Modification Process in Human Cataractous Lens and Porcine Cornea Using Coherence Tomography”, Journal of biomedical Optics, 4(1), 137-143, Jan. 1999. |
Lee et al., “Profilometry with a coherence scanning microscope”, Applied Optics, 29(26), 3784-3788, Sep. 10, 1990. |
Lubatschowski, “The German Ministry of Research and education funded this OCT guided fs laser surgery in Sep. 2005”, http://www.laser-zentrum-hannover.de/download/pdf/taetigkeitsbericht2005.pdf. |
Massow, O., et al., “Femtosecond laser microsurgery system controlled by OCT”, Laser Zentrum Hannover e.V., The German Ministry of education and research, 19 slides, 2007. |
Puliafito, Carmen, “Final technical Report: Air Force Grant #F49620-93-I-03337(1)” dated Feb. 12, 1997, 9 pages. |
Ren, Q., et al., “Axicon: A New Laser Beam Delivery System for Corneal Surgery”, IEEE Journal of Quantum Electronics, 26(12):2305-2308, Dec. 1990. |
Ren, Q., et al., “Cataract Surgery with a Mid-Infrared Endo-laser System”, SPIE Ophthalmic Technologies II, 1644:188-192, 1992. |
Thompson, K., et al., “Therapeutic and Diagnostic Application of Lasers in Ophthalmology”, Proceedings of the IEEE, 80(6):838-860, Jun. 1992. |
Thrane, L, et al., “Calculation of the maximum obtainable probing depth of optical coherence tomography in tissue”, Proceedings of SPIE, 3915:2-11, 2000. |
Wisweh, H., et al., “OCT controlled vocal fold femtosecond laser microsurgery”, Laser Zentrum Hannover e.V., The German Ministry of education and research, Grants: 13N8710 and 13N8712; 23 slides, 2008. |
Number | Date | Country | |
---|---|---|---|
20120069302 A1 | Mar 2012 | US |