This patent document relates to systems and techniques for ophthalmic imaging. In more detail, this patent document relates to systems and methods for providing an electronically controlled fixation light for improving a precision of aligning or docking an ophthalmic imaging system to a patient's eye.
A variety of advanced imaging devices have been developed over the years for ophthalmic imaging, diagnostics and surgery. For some applications, these imaging devices perform best when their optical axis is aligned with an optical axis of the imaged eye. Once the optical axis of the eye is aligned with the optical axis of the imaging device, some imaging devices enhance the precision of the imaging process by immobilizing the eye in the aligned position with the help of a patient interface or eye-docking system. As the precision of the imaging devices improves, the demand for eye-docking systems which provide more precise alignment also increases.
In typical existing systems the alignment is guided manually. The operator can direct the patient verbally, manually orient the eyeball, adjust portions of the imaging device, such as its objective or gantry, or use any combination of these approaches. These adjustments are performed iteratively during the docking process. However, the inaccuracy of these manual approaches can make the docking process quite time consuming and frustrating, and still fall short of achieving high quality alignment. Because of the limited precision of the manually guided alignment, the patient interface often ends up docked to the eye in an off-center position, with the eye's optical axis tilted and the eye laterally displaced relative to the optical axis of the imaging system.
Some imaging systems use guidance mechanisms that promise improvements for the alignment process. In some systems, such as in some surgical systems using excimer lasers, the alignment is aided by a fixation light. The fixation light can be centered with the optical axis of the imaging system. The patient can be instructed to train his eye on the fixation light. This fixation can align the patient's eye with the imaging system. However, even these fixation light systems have limitations.
This patent document discloses fixation light controller systems with improved functionalities. The patient's eye typically has both lateral and angular misalignments relative to the imaging system. Simply looking at a fixed fixation light centered with the optical axis of the imaging device does not eliminate both types of misalignment.
Therefore, in some systems, including some YAG lasers and slit lamps, the fixation light is not fixed and can be manually or mechanically adjusted. However, since the adjustment is only mechanical/manual, the precision of these fixation lights is considerably less than the precision of the imaging systems. Further, such mechanical adjustments can be quite time consuming and frustrating because of their limited precision.
Finally, in some systems the fixation light may be controlled in part manually and in part electronically. In the hands of expert surgeons, manual operations may improve the alignment; in other cases, such systems may still lack the required precision.
The present patent document discloses fixation light controller systems that offer solutions for the above described problems. In some implementations, an ophthalmic system may include an ophthalmic imaging device configured to generate an image of a portion of an imaged eye of a patient, an image processor, configured to determine a misalignment of the imaged eye and the imaging device by processing the generated image, and to generate a control signal according to the determined misalignment, and a misalignment-reduction system, configured to receive the control signal, and to generate a misalignment-reduction response.
In some implementations the ophthalmic imaging device can include an electronic sensing system that senses a collected imaging light from the imaged eye, including at least one of a Charge-Coupled Device (CCD) array, a Complementary Metal-Oxide Semiconductor (CMOS) array, a pixel-array, and an electronic sensor array, and an electronic display system that displays the image of a portion of the imaged eye in relation to the sensed collected imaging light, including at least one of a Light Emitting Diode (LED) display, a plasma screen, an electronic display, a computer display, a Liquid Crystal Display (LCD) screen, a Cathode Ray Tube (CRT) display, a video-module, a video microscope display, a stereo video microscope display, a high definition (HD) video microscope, a processor-based image system, an opto-mechanical projector of the electronic or digital type, and a light-source movable by an electro-mechanical actuator.
In some implementations the image processor is configured to identify an ophthalmic structure in the image, and to determine a measure of misalignment by determining a location of the ophthalmic structure relative to a reference of the imaging device. In some implementations the image processor is configured to identify the ophthalmic structure by determining a high-gradient line in the image, separating image elements with substantially different brightness or color.
In some implementations the image processor is configured to fit at least one of a circle and an ellipse to the high-gradient line by measuring radial distances between the high-gradient line and the circle or ellipse, to determine a location coordinate of the fitted circle or ellipse by minimizing a measure of the radial distances, and to determine a misalignment-measure by relating the determined location coordinate and a coordinate of the reference. In some implementations the image processor is configured to determine a high-contrast line in the image, to determine misalignment distances between the high-contrast line and a targeting pattern, and to determine a misalignment-measure from the misalignment distances.
In some implementations the reference of the imaging device is at least one of an objective, a patient module, a docking tip, an interface, a contact lens, a pupil, a viewing frame, a reference frame, and an internal lens of the ophthalmic system, and the imaging device is configured to generate a reference pattern related to the reference to assist the image processor to determine the misalignment of the imaged eye and the imaging device. In some implementations the recognized ophthalmic structure is a limbus of the imaged eye. In some implementations at least a portion of the image processed by the image processor is not displayed by the imaging device.
In some implementations the misalignment-reduction system can include a fixation light source, and the misalignment-reduction response comprises the fixation light source generating a fixation light in response to the received control signal. In some implementations the fixation light source is configured to generate the fixation light for a non-imaged eye of the patient, and to move the generated fixation light according to the received control signal to assist a reduction of a misalignment between the imaged eye and a reference-component of the ophthalmic system. In some implementations the fixation light source can include at least one of a LED array, a plasma screen, an electronic display, a computer display, an LCD screen, a video-module, an opto-mechanical projector of the electronic or digital type, a CRT display, a slit-lamp, a processor-based image system, and a light-source movable by an electro-mechanical actuator. In some implementations the fixation light source is configured to generate the fixation light to guide the patient to reduce an angular misalignment.
In some implementations the image processor is configured to determine the angular misalignment by fitting an ellipse to a high-contrast line of the image, and analyzing at least one of an aspect ratio and an area of the fitted ellipse. In some implementations the fixation light source can include a collimator to generate a fixation light to guide the patient to reduce a lateral misalignment.
In some implementations the misalignment-reduction system can include a gantry, configured to move a movable portion of the imaging device, and a gantry controller, configured to receive the control signal from the image processor, and to move the gantry according to the received control signal, and the misalignment-reduction response can include the gantry controller moving the gantry and thus the movable portion of the imaging device to reduce a lateral misalignment. In some implementations the gantry is also part of the ophthalmic imaging device. In some implementations the misalignment-reduction system can include a support-gantry, configured to move a patient support relative to the imaging device, and a gantry controller, configured to receive the control signal from the image processor, and to move the support-gantry according to the received control signal, and the misalignment-reduction response can include the gantry controller moving the support-gantry and thus the patient support to reduce a lateral misalignment.
In some implementations the image processor is configured to determine an angular and a lateral misalignment by processing the image, and the misalignment-reduction system can include only one of a fixation light source and a gantry controller.
In some implementations the image processor is configured to determine an angular and a lateral misalignment by processing the image, and the misalignment-reduction system can include a fixation light source, a gantry and a gantry controller. In some implementations, the image processor is configured to determine an angular and a lateral misalignment by processing the image and a misalignment information.
In some implementations the imaging system can include a locator light source, configured to project a locator light on the imaged eye, and the image processor is configured to identify an apical reflected locator light in the image generated by the imaging device, and to determine the misalignment information by analyzing the apical reflected locator light. In some implementations the misalignment information is at least one of an angular misalignment information, related to a vector in the image between the apical reflected locator light and a location of an imaged ophthalmic structure, and a lateral misalignment information, related to a vector in the image between a reference of the imaging system and at least one of the apical reflected locator light and the location of an imaged ophthalmic structure.
In some implementations the ophthalmic system is configured to reduce the angular misalignment by adjusting the fixation light source, and to reduce the lateral misalignment by operating the gantry controller. In some implementations the fixation light is adjustable so that the locator light and a location of an imaged ophthalmic structure can be aligned by adjusting the fixation light. In some implementations the fixation light source and the locator light source are capable of operating at different wavelengths. In some implementations the locator light is invisible for the imaged eye.
In some implementations a patient interface is configured to dock to the imaged eye of the patient after the misalignment-reduction system has executed the misalignment-reduction response. In some implementations the misalignment-reduction system can include a fixation light source, configured to generate a fixation light for the imaged eye of the patient, and to adjust the generated fixation light according to the received control signal to assist a reduction of a misalignment between the imaged eye and a reference-component of the ophthalmic system. Some implementations include a locator light, focusable to a second focal point different from a first focal point of the fixation light.
In some implementations a method of aligning an eye with an ophthalmic system can include generating an image of a portion of an imaged eye of a patient by an ophthalmic imaging device, determining a misalignment of the imaged eye and the imaging device by an image processor processing the generated image, and generating a misalignment-reduction response electronically by a misalignment-reduction system based on the determined misalignment.
In some implementations the determining the misalignment can include identifying an ophthalmic structure in the image, and determining a location of the ophthalmic structure relative to a reference of the imaging device. In some implementations the generating the misalignment-reduction response can include generating a fixation light by a fixation light source according to the determined misalignment.
In some implementations the generating the fixation light can include generating the fixation light to guide the patient to reduce an angular misalignment. In some implementations the generating the fixation light can include generating a fixation light to guide the patient to reduce a lateral misalignment, wherein the fixation light source can include a collimator.
In some implementations the generating the fixation light can include generating the fixation light for a non-imaged eye of the patient, and the generating the misalignment-reduction response can include adjusting the fixation light according to the determined misalignment to assist the patient to reduce the misalignment. In some implementations the generating the fixation light can include generating the fixation light for the imaged eye of the patient, and the generating the misalignment-reduction response can include adjusting the fixation light according to the determined misalignment to assist the patient to reduce the misalignment.
In some implementations the generating the misalignment-reduction response can include moving a gantry of the imaging system by a gantry controller to reduce a lateral misalignment.
In some implementations the determining the misalignment can include determining an angular and a lateral misalignment by the image processor processing the image and a misalignment information, and the generating the misalignment-reduction response can include adjusting a fixation light of a fixation light system and operating a gantry controller. In some implementations the determining the misalignment can include projecting a locator light onto the imaged eye by a locator light system, locating an apical reflected locator light in the image generated by the imaging device, and determining the misalignment information using the located apical reflected locator light. In some implementations the determining the misalignment information can include determining an angular misalignment information, related to a vector in the image between the apical reflected locator light and a location of an imaged ophthalmic structure, and determining a lateral misalignment information, related to a vector in the image between a reference of the imaging system and at least one of the apical reflected locator light and the imaged ophthalmic structure.
In some implementations the generating the misalignment-reduction response can include reducing the angular misalignment by adjusting the fixation light, and reducing the lateral misalignment by operating the gantry controller. In some implementations the reducing the angular misalignment and the reducing the lateral misalignment are repeated iteratively. In some implementations the generating the misalignment-reduction response can include projecting the fixation light into the imaged eye, and reducing the lateral and the angular misalignment by causing the head of the patient to move laterally to align the locator light and the fixation light.
In some implementations an ophthalmic system can include an imaging device that generates an image of an imaged eye of a patient, an image processor that determines an angular and a lateral misalignment of the imaged eye and the imaging device by processing the generated image, a fixation light system that projects a fixation light on an eye of the patient to assist a reduction of the angular misalignment, and a gantry that adjusts a movable optic of the system to reduce the lateral misalignment. In some implementations the ophthalmic system can include an indicator light system that projects an indicator light on the imaged eye to provide a misalignment information for the image processor.
Implementations and embodiments in this patent document provide a fixation light system for ophthalmic imaging devices for increasing the precision of the alignment of the imaged eye and the imaging device.
However, verbal instructions can be unclear to an already disoriented patient, and manipulating the eye can be cumbersome and imprecise. Also, the patient is likely to undo or resist the actions of the surgeon or technician.
Some ophthalmic systems can utilize a fixation light to provide guidance for the patient. However, fixation light devices still have shortcomings as discussed above. Some devices provide adjustable fixation lights as an improvement. However, even in such systems, the location of the fixation light is typically adjusted manually or mechanically, still resulting in an adjustment process with limited precision.
The ophthalmic imaging device 110 can include an imaging light source 111 that provides an imaging light for the imaged eye 1i. The imaging light source 111 can be a single light, a ring of e.g. 4, 6 or 8 lights, or a light source with a continuous ring shape. An objective 112 can collect a fraction of the imaging light, returned by the imaged eye 1i, and direct it as a collected imaging light 113 to an optic 114. The optic 114 can guide the collected imaging light 113 towards an imaging module 115. In general, the optic 114 can be quite complex, including a large number of lenses and mirrors. The optic 114 can also be multifunctional, for example also configured to guide a surgical laser beam to the imaged eye 1i. The imaging module 115 can provide an image for an operator of the imaging system 100 via an imaging interface.
In some implementations, the ophthalmic imaging device 110 can include a microscope, an ophthalmic microscope, or a stereo microscope. An imaging interface of these microscopes can include the eyepiece of these microscopes.
In some implementations, the ophthalmic imaging device 110 can generate the image at least in part electronically. For example, the imaging module 115 of the ophthalmic imaging device 110 can include an electronic sensing system that senses the collected imaging light 113. The electronic sensing system can include a Charge-Coupled Device (CCD)-array, a Complementary Metal Oxide Semiconductor (CMOS) array, a pixel-array, or an electronic sensor array to sense the collected imaging light 113.
In these electronic imaging systems the imaging module 115 can also include an electronic display system as an imaging interface. This electronic display can display an electronic image of a portion of the imaged eye 1i based on the sensed light 113. This electronic display or imaging interface can be, for example, a Light Emitting Diode (LED), an organic LED (OLED) display, an active matrix OLED (AMOLED) display, a plasma screen, an electronic display, a computer display, a Liquid Crystal Display (LCD) screen, a Cathode Ray Tube (CRT) display, a video-module, a video microscope display, a stereo video microscope display, a High Definition (HD) video microscope, a processor-based image system, an opto-mechanical projector of the electronic or digital type, or a light-source movable by an electro-mechanical actuator. In some implementations, the above elements of the imaging systems can be combined.
In some implementations, the ophthalmic imaging device 110 can include an optical coherence tomographic (OCT) imaging system, as described in relation to
In some implementations, the misalignment reduction system 130 may include the objective 112; in others, portions of the optic 114.
The image processor 120 can be configured to identify an ophthalmic structure in the image, generated by the imaging device 110, and to determine a location of the ophthalmic structure relative to a reference of the imaging device. The reference of the imaging device can be the objective 112, a patient module, a docking tip, an interface, a contact lens, a pupil, a viewing frame, a reference frame, or an internal lens of the ophthalmic system. The imaging module 115 can be configured to generate a reference pattern related to the reference to assist the image processor to determine the misalignment of the imaged eye and the imaging device. A targeting circle similar to the targeting pattern 17 can be such a reference pattern. Other reference patterns may include cross hairs, multiple circles, and combinations thereof.
The image processor 120 may be configured to recognize the limbus 6 as the ophthalmic structure. The image processing may be based on the pupil 4 as well, but often the limbus 6 forms a more regular circle and thus is well suited for the image processing.
Next, the image processor 120 can identify a high-gradient or high-contrast pixel 121 along the scan as the pixel where the recorded brightness or color varies the fastest. A high-contrast or high-gradient line 122 can be defined by connecting the high-gradient/contrast pixels of nearby scans. Such a high-gradient/contrast line can separate ophthalmic regions with strongly differing brightness or color and thus can be a useful indicator of ophthalmic structures, such as the limbus 6 or the pupil 4. Numerous other methods of machine-vision and image processing are known in the art to determine structures and their boundaries, which can be used in place of the above described high-gradient/contrast method.
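The scan-line gradient search described above can be illustrated with a brief sketch. The Python below is purely illustrative and not part of the disclosed embodiments; the function name, the simple forward-difference gradient, and the sample brightness values are assumptions.

```python
def high_gradient_pixel(scan):
    """Return the index along a scan line where the recorded brightness
    varies the fastest, i.e. the high-gradient pixel of this scan."""
    # Forward-difference gradient between neighboring pixels.
    grads = [abs(scan[i + 1] - scan[i]) for i in range(len(scan) - 1)]
    return max(range(len(grads)), key=grads.__getitem__)
```

Connecting the indices returned for nearby scans would trace out the high-gradient/contrast line 122.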
Δ = [(Δ1² + Δ2² + Δ3² + Δ4²)/4]^(1/2)
In a typical case, the image processor 120 may be able to fit a fitting circle 124 to the high-gradient/contrast line 122, with or without adjusting its radius, and thus conclude that the ophthalmic structure indicated by the high-contrast line 122 is circular. Next, the image processor 120 may determine that the color of the pixels changes from white to non-white across the high-gradient/contrast line 122. These findings can be sufficient for the image processor 120 to conclude that it has identified the circular limbus 6 of the imaged eye 1i.
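One way to realize such a radial-distance fit is sketched below. The coarse grid search, the step sizes, and the function names are illustrative assumptions only; a production system would use a proper least-squares or machine-vision fit. The residual being minimized is the RMS of the radial distances, i.e. the misalignment measure Δ above generalized to n points.

```python
import math

def rms_radial_residual(points, cx, cy, r):
    """RMS of the radial distances between edge points and a candidate
    circle with center (cx, cy) and radius r."""
    res = [math.hypot(x - cx, y - cy) - r for x, y in points]
    return math.sqrt(sum(d * d for d in res) / len(res))

def fit_circle(points, span=5.0, step=0.5):
    """Coarse grid search around the centroid for the circle minimizing
    the RMS radial residual. Returns (residual, cx, cy, r)."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    best = None
    steps = int(round(span / step))
    for i in range(-steps, steps + 1):
        for j in range(-steps, steps + 1):
            cx, cy = mx + i * step, my + j * step
            # For a fixed center, the mean point-to-center distance is the
            # radius minimizing the summed squared radial residuals.
            r = sum(math.hypot(x - cx, y - cy) for x, y in points) / n
            d = rms_radial_residual(points, cx, cy, r)
            if best is None or d < best[0]:
                best = (d, cx, cy, r)
    return best
```

Because the optimal radius for each candidate center has a closed form, only the center coordinates need to be searched.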
During this fitting process the image processor 120 determines the coordinates of the center of the limbus 6, since the limbus 6 is concentric with the shifted fitting circle 124′ and thus the center of the limbus 6 is located at the same (Cx,Cy) coordinates as the center of the shifted fitting circle 124′. Therefore, the image processor 120 can determine a misalignment vector 143 that connects the (Cx,Cy) coordinates of the center of the limbus 6 to the known center coordinates of a targeting pattern 117. The misalignment vector 143 may be used by a misalignment reduction system 130 to reduce the misalignment of the imaged eye 1i with the ophthalmic system 100 as described below.
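The construction of the misalignment vector 143 from the fitted center and the targeting pattern center reduces to a vector difference. The sketch below is illustrative only (the function name and coordinate convention are assumptions); it also reports the magnitude of the misalignment.

```python
import math

def misalignment_vector(limbus_center, target_center):
    """Vector from the fitted limbus center (Cx, Cy) to the known center
    of the targeting pattern, plus its magnitude."""
    (cx, cy), (tx, ty) = limbus_center, target_center
    dx, dy = tx - cx, ty - cy
    return dx, dy, math.hypot(dx, dy)
```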
The reference-component of the imaging device 110 can be the objective 112, a patient module, a docking tip, an interface, a contact lens, a pupil, a viewing frame, a reference frame, an internal lens of the ophthalmic system, or any equivalents.
The location or display of the targeting pattern 117 can be fixed to the reference-component, in effect indicating the position of the reference-component. Therefore, the simultaneous display of the image portion of the imaged eye 1i and the targeting pattern 117 by the imaging module 115 can effectively assist the determination of the misalignment of the imaged eye 1i.
The image processor 120 can analyze the simultaneously displayed image portion of the imaged eye 1i and the target pattern 117 and compute the misalignment. The details of computing the misalignment were described above extensively. The image processor 120 can summarize the computed direction and magnitude of the misalignment by generating the misalignment vector 143. Based on this misalignment vector 143, the image processor 120 can compute a misalignment reduction vector 144 to be used by the misalignment reduction system 130 to reduce or eliminate the computed misalignment. In general, the misalignment reduction vector 144 need not be the same as, or simply the opposite of, the misalignment vector 143, as it represents how the misalignment reduction system is to be adjusted to reduce or eliminate the misalignment. As such, the misalignment reduction vector 144 also depends on the distance of the misalignment reduction system 130 from the eye 1 and on other factors and thus can refer to a large variety of misalignment reduction measures.
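As a toy illustration of the distinction between the two vectors, a geometry-dependent gain can map the misalignment vector 143 into a reduction vector 144. The function name and the gain value below are purely hypothetical and have no basis in the disclosure.

```python
def reduction_vector(misalignment, gain=-1.2):
    """Map a measured misalignment vector to an actuator adjustment.
    The gain encodes system geometry, e.g. the distance of the
    misalignment reduction system from the eye; its value here is
    an arbitrary illustrative assumption."""
    dx, dy = misalignment
    return (gain * dx, gain * dy)
```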
Next, the image processor 120 can generate a fixation light control signal for the fixation light source 140 according to the determined misalignment reduction vector 144.
In some implementations, the image of the eye portion and the targeting pattern 117 are not necessarily displayed. Rather, they can be provided for the image processor 120 by the imaging device 110 in an electronic form only, invisible for the system operator or surgeon.
Some image processors 120 do not utilize the fitting circle 124 of
The search algorithm can be based e.g. on minimizing a misalignment-measure, such as the average misalignment Δ above, or on symmetrizing the misalignment distances Δ*1 . . . Δ*n in opposing directions, among others. After the search, the misalignment vector 143 can be determined to characterize the misalignment. The image processor 120 can then compute the misalignment reduction vector 144 based on the determined misalignment vector 143 and output a fixation light control signal towards the fixation light source 140 corresponding to the misalignment reduction vector 144.
The fixation light source 140 can first generate and display the fixation light 145, and then move the displayed fixation light 145 according to the received fixation light control signal. Since the movements of the control eye 1c and the imaged eye 1i closely track each other, as the control eye 1c is moved by the patient according to the displayed fixation light 145, the imaged eye 1i moves in a correlated manner. Because of this correlation between the movements of the imaged eye 1i and the control eye 1c, the fixation light source 140 can assist the reduction of the misalignment of the imaged eye 1i relative to the ophthalmic imaging device 110.
Other embodiments may simply display the fixation light 145 by the fixation light source 140 at a properly chosen location according to the fixation light control signal, instead of moving it. In either of these embodiments, the patient can be instructed to follow, or focus on, the fixation light 145 with the control eye 1c.
The fixation light source 140 can include a LED array, a plasma screen, an electronic display, a computer display, an LCD screen, a video-module, an opto-mechanical projector, a slit-lamp, a processor-based image system, or a light-source movable by an electro-mechanical actuator.
To facilitate procedures on both eyes, some embodiments may include two fixation light sources 140, one on each side of the objective 112.
In some implementations, the image processor 120 may display the processed image e.g. for informing the medical technician or surgeon. In other implementations at least a portion of the image processed by the image processor 120 may not be displayed by the imaging system 100, only provided in electronic format to the image processor 120 by the imaging device 110.
In operation, the image processor 120 can determine a lateral misalignment Δ of the imaged eye from the analysis of the image of the imaged eye 1i, and compute a corresponding misalignment reduction vector 144fl, where the label l refers to the lateral misalignment and the label f to this fixation light system. The image processor 120 then can generate a fixation light control signal representing the calculated misalignment reduction vector 144fl to be sent to the fixation light source 140. Upon receiving the fixation light control signal, the fixation light source 140 can move or adjust the collimated fixation light 145 by the misalignment reduction vector 144fl, shown by the solid arrow. The patient 8 can be instructed to move his/her head to find the adjusted collimated fixation light 145. In order to actually see the collimated fixation light 145, the patient 8 will have to move his/her head laterally until the lateral misalignment Δ is essentially eliminated.
Some aspects of these gantry-based systems differ from those of the fixation light systems of
In practice, an ophthalmic surgeon often faces a combination of the above angular and lateral misalignments. Advanced single-component implementations of the misalignment-reduction system 130 may be able to reduce or eliminate both of these misalignments, as described next.
For example, in a misalignment-reduction system 130 with a fixation light source 140 component only, in a first phase the image processor 120 may follow the method of
In an implementation the image processor 120 can project the fixation light 145 at a suitable first location and the patient can be instructed to focus on this once-adjusted fixation light 145. From measuring the ellipticity of the limbus 6i, the knowledge of the first location, and the location of the eye on the imaging interface 115, the image processor 120 can determine the lateral and angular misalignments. Based on the determined lateral misalignment, the patient can be instructed to move the imaged eye 1i to the center of the imaging device 110. This process may be performed iteratively to reach sufficient precision. Sometimes the fixation light 145 can be readjusted and the ellipticity re-measured to assist the process.
After the eye is centered with sufficient precision, the image processor 120 may adjust the fixation light 145 for a second time, typically to a second location corresponding to the center of the imaging device 110. The patient 8 focusing on this twice adjusted fixation light 145 can eliminate the angular misalignment as well.
The apparent ellipticity of the limbus 6i may have a third cause as well besides the two types of misalignments: often the limbus 6i itself is not entirely circular. In some implementations, the image processor 120 may need to perform an advanced image processing algorithm to separate the three causes of the ellipticity. The advanced image processing may include tracking suitably chosen merit functions or the analysis of optical distortions of the image. An example of the merit function can be the area of the fitted ellipse.
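For the purely projective contribution to the ellipticity, a circular limbus viewed at a tilt angle θ projects to an ellipse whose minor axis is the major axis scaled by cos θ, so the aspect ratio alone yields an estimate of the angular misalignment. The sketch below encodes only this relation, under the simplifying assumption of pure foreshortening; it ignores the other two causes of ellipticity discussed above, and its name and interface are hypothetical.

```python
import math

def tilt_from_aspect_ratio(major, minor):
    """Estimate the tilt angle (radians) of a circle whose projected
    image has the given major and minor axes, assuming pure
    foreshortening: minor = major * cos(theta)."""
    ratio = min(1.0, minor / major)  # guard against ratios slightly above 1
    return math.acos(ratio)
```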
Similarly, the single-component gantry-based misalignment-reduction system 130 may be able to correct both types of misalignments in separate phases as well.
If the above described two-phase methods only reduced the two misalignments but did not eliminate them, the two phases can be repeated iteratively to substantially eliminate the two types of misalignments. A large variety of optimization and other search algorithms can be used to facilitate such iterative approaches.
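Such an iterative repetition can be sketched as a simple feedback loop. The callback interface, the tolerance, and the termination policy below are all hypothetical choices for illustration, not taken from the disclosure.

```python
def align(measure, reduce_angular, reduce_lateral, tol=0.05, max_iter=20):
    """Alternate angular and lateral corrections until both residual
    misalignments fall below the tolerance, or give up after max_iter."""
    for _ in range(max_iter):
        angular, lateral = measure()
        if abs(angular) < tol and abs(lateral) < tol:
            return True  # both misalignments substantially eliminated
        reduce_angular(angular)
        reduce_lateral(lateral)
    return False
```

Each correction need only reduce its misalignment by some fraction per pass for such a loop to converge geometrically.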
Aspects of this implementation include that the relative positions of the optical elements in the imaging device 110 are not changed during regular operation, and thus a high level of alignment and precision of the optics can be maintained. At the same time, the weight and physical extent of the patient support 168 are much greater than those of the objective 112, and thus the high precision adjustment of the patient support 168 poses its own challenges.
The two phases of alignment reduction can be performed in the opposite order or in alternating repeated phases. Referring to
The misalignment information can originate from a locator light source 170. The locator light source 170 can generate a locator light 175 which can be coupled into the main optical pathway by a beam splitter 171. The optic 114 and in particular the objective 112 can guide or project the locator light 175 onto the imaged eye 1i.
If the imaged eye 1i can be approximated by a reflecting sphere, or at least a portion of a reflecting sphere, then standard geometric considerations reveal that the portion of the locator light 175 that reflects back into the objective 112 parallel to the optical axis 28 is the one that is reflected from the apex of the spherical eye 1. This reflected light will be referred to as an apical reflected locator light 177. The other rays are shown to reflect away from the system optical axis 28.
For a spherical imaged eye 1i having a lateral misalignment Δ relative to the system optical axis 28, the white spot image of the apical reflected locator light 177i does not coincide with the system optical axis, indicated by the solid cross. It is noted, though, that the relative locations of the white spot and the cross are independent of a possible angular misalignment of the imaged eye. Thus, for spherical eyes the vector connecting the imaged apical reflected locator light 177i with the cross-mark of the system optical axis 28 can provide the additional alignment information that enables the image processor 120 to determine the lateral misalignment independently of the angular misalignment.
Therefore, the vector or distance connecting the image spot of the apical reflected locator light 177i and the limbus center 6ic is an example of a dominantly or purely angular misalignment information that can be used by the image processor 120 to generate a misalignment reduction vector 144fa for the fixation light source 140 to correct this angular misalignment.
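For a spherical eye, the separation of the angular and lateral cues described above reduces to two vector differences in image coordinates. A minimal sketch, with hypothetical pixel coordinates standing in for the apex reflection 177i, the limbus center 6ic, and the system optical axis 28:

```python
import numpy as np

def misalignment_vectors(apex_spot, limbus_center, optical_axis):
    """Separate the two misalignment cues for a spherical eye.

    The apex reflection moves with lateral displacement but not with
    tilt, so:
      - angular cue: limbus center relative to the apex reflection
      - lateral cue: apex reflection relative to the optical axis"""
    apex = np.asarray(apex_spot, dtype=float)
    angular = np.asarray(limbus_center, dtype=float) - apex
    lateral = apex - np.asarray(optical_axis, dtype=float)
    return angular, lateral
```

The angular cue would drive the fixation-light adjustment and the lateral cue the gantry adjustment, in whichever order or alternation an implementation chooses.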
On the other hand, determining the lateral displacement Δ, e.g. between the system optical axis 28 and the center 1x of the imaged eye 1i, may be more challenging when the complex shape of the eye is taken into account than the procedure in
In a subsequent second phase, the distance or vector between the system optical axis 28, indicated by the solid cross, and the overlapping image spot of the apical reflected locator light 177i and the limbus center 6ic (solid x) can provide a lateral misalignment information. The image processor 120 may compute the lateral misalignment reduction vector 144gl using this lateral misalignment information and send a corresponding control signal to the gantry controller 150. In response, the gantry controller 150 can adjust the gantry 155 with the lateral misalignment reduction vector 144gl.
Numerous equivalent implementations of the above principles can be practiced as well, for example performing the first and second phases in repeated iterative steps or in reverse order.
Once both types of misalignments have been reduced or eliminated by the misalignment-reduction system 130, the operator of the ophthalmic system 100 may lower a patient interface 180, configured to dock to the imaged eye 1i of the patient. This patient interface 180 can immobilize the imaged eye 1i to keep it fixed for subsequent procedures. These procedures may include diagnostic procedures, imaging procedures and ophthalmic surgical procedures.
In detail, the objective 112 of the ophthalmic system 100 can include a distal objective lens 112-1, contained in an objective housing 112-2. The patient interface 180 can include an interface lens, contact lens, sometimes also called applanation plate 180-1, contained in an interface housing 180-2. The patient interface 180 may be attached to the objective 112 or to the distal end of the imaging system 110. In other embodiments, part of the patient interface 180 can be attachable to the eye and the other part to the distal end of the imaging system 110. The patient interface 180 can be attachable to the eye with a suction ring or vacuum skirt 180-3.
In these architectures, the patient interface 180 can be docked with the imaged eye 1i after the alignment of the imaged eye 1i with the imaging device 110 has been completed. In other embodiments, the patient interface 180 can be docked with the imaged eye 1i in an iterative manner. First, the imaged eye 1i can be brought into alignment with the imaging device 110. Second, the patient interface can be lowered onto the imaged eye 1i to make contact while still allowing the imaged eye 1i some movement. Since during the first phase the imaged eye 1i may have moved, or the image processor 120 may not have determined the alignment perfectly, in a third phase the alignment procedure can be repeated and one or more new misalignment reduction vectors can be computed by the image processor 120. Fourth, the imaged eye 1i can be realigned using the newly computed misalignment reduction vector(s). These partial or stepwise phases can be followed by the full-strength docking of the patient interface 180 onto the imaged eye 1i, preventing further relative movement of the imaging device 110 and the imaged eye 1i.
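The stepwise docking sequence above can be sketched as a simple control flow. The callbacks are hypothetical stand-ins for the image processor and the gantry/patient-interface controls; the soft-contact step is modeled as possibly disturbing the alignment, which is why the re-measurement phase exists.

```python
from enum import Enum, auto

class DockPhase(Enum):
    ALIGN = auto()
    SOFT_CONTACT = auto()
    REALIGN = auto()
    FULL_DOCK = auto()

def stepwise_dock(measure_misalignment, realign, lower_interface, tol=0.05):
    """Align, lower the interface to soft contact, re-measure (the eye
    may have moved), realign, then dock at full strength."""
    phases = []
    # Phase 1: initial alignment
    while measure_misalignment() > tol:
        realign()
    phases.append(DockPhase.ALIGN)
    # Phase 2: soft contact, still permitting some eye movement
    lower_interface(full_strength=False)
    phases.append(DockPhase.SOFT_CONTACT)
    # Phase 3: repeat measurement and realignment after soft contact
    while measure_misalignment() > tol:
        realign()
    phases.append(DockPhase.REALIGN)
    # Phase 4: dock at full strength, immobilizing the eye
    lower_interface(full_strength=True)
    phases.append(DockPhase.FULL_DOCK)
    return phases
```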
In some cases the first focal point 146 can be fixed to lie on the system optical axis 28. In these implementations, (i) the image processor 120 can identify the lateral and angular misalignments of the imaged eye 1i by processing the image of the eye 1i; (ii) the image processor 120 can present or project the second fixation light 175 with a suitably located initial focal point 176, and (iii) the image processor 120 can move or adjust the second fixation light 175 towards the system optical axis 28 to guide the patient 8 to align the imaged eye optical axis 9i with the system optical axis 28. In
In another implementation, the second fixation light 175 and its focal point 176 can be fixed on the system optical axis 28 and the first focal point 146 can be adjusted by the image processor 120 to guide the patient 8 to align the imaged eye optical axis 9i with the system optical axis 28. In
Some embodiments may assist the alignment process in these collimator implementations by providing the locator light 175, focused at the second focal point 176. Since the locator light 175 is not collimated, the patient 8 is able to see the second focal point 176 even from misaligned positions. In these embodiments, after the patient 8 fixates on the locator light 175, the image processor 120 can subsequently move or adjust the locator light 175 (shown by the solid arrow) to assist the patient in rotating and moving the imaged eye until the patient sees the collimated fixation light 145.
Some of these ophthalmic systems 100 may also include a secondary imaging system 195. This secondary imaging system 195 can include an optical coherence tomographic (OCT) system. OCT systems, especially the spectrometer-based frequency-domain type, are well suited to image three dimensional ophthalmic target regions, as they are capable of acquiring image data from all depths of the target region simultaneously. The beams of the procedure laser 190 and the secondary imaging system 195 can be coupled into the main optical pathway by beam splitters BS1 and BS2, respectively. Such systems may combine the z-directional imaging functionality of the OCT imaging system 195 with the above described image processing-based alignment procedure to achieve alignment both with visible ophthalmic structures as well as with targets inside the eye.
The generating an image 210 can include generating an image 212 of a portion of the imaged eye 1i with the imaging device 110.
The determining the misalignment 220 can include (1) identifying an ophthalmic structure 222 in the image 212. The ophthalmic structure can be the pupil 4, the lens 5, and the limbus 6, among others. The determining 220 can also include (2) determining the misalignment by determining a location of the ophthalmic structure 222 relative to a reference of the imaging device by the image processor 120. The reference of the imaging device can be the objective 112, a patient module, a docking tip, an interface, a contact lens, a pupil, a viewing frame, a reference frame, an internal lens of the ophthalmic system, or a reference pattern 117 generated by the imaging device 110. The misalignment can be a lateral or an angular misalignment, determined by the image processor 120 by analyzing the image using software implementations. Finally, (3) the image processor 120 can generate a control signal according to the determined misalignment and output the generated control signal to the misalignment-reduction system 130.
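Steps (2) and (3) above amount to computing the offset of an identified structure from a reference and turning it into a control signal. A minimal sketch, assuming the structure's image centroid and the reference point are already known pixel coordinates, and taking the control signal to simply be the negated offset (a reduction vector):

```python
import numpy as np

def determine_misalignment(structure_centroid, reference_point):
    """Step (2): the misalignment is the offset of the identified
    ophthalmic structure from the chosen reference of the imaging
    device.  Step (3): the control signal for the misalignment-
    reduction system is the negated offset."""
    offset = (np.asarray(structure_centroid, dtype=float)
              - np.asarray(reference_point, dtype=float))
    control_signal = -offset
    return offset, control_signal
```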
The generating the misalignment-reduction response 230 can include generating the misalignment-reduction response 230 by the misalignment reduction system 130. In some embodiments, the generating the misalignment-reduction response 230 can include generating the fixation light 145 by the fixation light source 140 according to the misalignment determined by the image processor 120, in response to the control signal from the image processor 120. The fixation light 145 can guide the patient 8 to reduce an angular or a lateral misalignment.
In an implementation, the fixation light source 140 may include a collimator 142 to generate the fixation light 145 to guide the patient 8 to reduce a lateral misalignment. The fixation light 145 can be generated for the non-imaged, or control eye 1c, and the fixation light 145 can be adjusted according to the determined misalignment to assist the patient to reduce the misalignment. In other implementations, the fixation light 145 can be generated for the imaged eye 1i.
The generating the misalignment-reduction response 230 can include the gantry controller 150 moving the gantry 155 of the imaging device 110 to reduce a lateral misalignment. In other embodiments, the gantry controller 150 can move the bed 168, or a combination of the bed 168 and the gantry 155.
The determining the misalignment 220 can include determining an angular and a lateral misalignment by the image processor 120 processing the image and an additional misalignment information. Correspondingly, the generating the misalignment-reduction response 230 can include operating the fixation light system 140 and the gantry controller 150 to reduce the angular and the lateral misalignment.
The determining the misalignment 220 can include (1) projecting the locator light 175 onto the imaged eye 1i by the locator light source 170, (2) locating an image 177i of the apical reflected locator light 177 in the image generated by the imaging device 110, and (3) determining the misalignment information using the located imaged apical reflected locator light 177i.
The determining the misalignment information 220 can include determining an angular misalignment information, related to a distance or vector between the image of the apical reflected locator light 177i and a location of an imaged ophthalmic structure; and determining a lateral misalignment information, related to a distance or vector between the imaged apical reflected locator light 177i or the location of the imaged ophthalmic structure and a reference of the imaging system. The generating the misalignment-reduction response 230 can include reducing the angular misalignment by adjusting the fixation light system 140 and reducing the lateral misalignment by operating the gantry controller 150. As the first phase of reducing the misalignment may only reduce the misalignment but not eliminate it, the reducing the angular misalignment and the reducing the lateral misalignment phases can be repeated iteratively and alternately in some implementations.
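The alternating repetition of the angular (fixation-light) and lateral (gantry) phases can be sketched with scalar misalignment magnitudes. The gains are hypothetical and model phases that each remove only part of their respective component, so several alternations are needed.

```python
def alternate_reduction(angular, lateral, ang_gain=0.8, lat_gain=0.8,
                        tol=0.01, max_iter=100):
    """Alternately reduce the angular and lateral misalignment
    magnitudes until both fall below `tol`.

    Each pass removes a fraction (the gain) of its component,
    modeling a phase that only partially succeeds."""
    for i in range(max_iter):
        if angular < tol and lateral < tol:
            return angular, lateral, i
        angular *= (1.0 - ang_gain)   # fixation-light phase
        lateral *= (1.0 - lat_gain)   # gantry phase
    return angular, lateral, max_iter
```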
In some embodiments, the generating the misalignment-reduction response 230 can include using the locator light as a second fixation light 175. In these embodiments, the reducing the lateral and the angular misalignment can include instructing the patient 8 to align the first fixation light 145 and the locator/second fixation light 175.
Finally, some implementations of the ophthalmic imaging system may include an imaging device that generates an image of an imaged eye of the patient and a processor that determines a misalignment of the imaged eye and the imaging device by processing the generated image. The processor can control a fixation light system to project a fixation light on an eye of the patient to reduce an angular misalignment, and control a gantry to adjust a movable optical element of the system to reduce a lateral misalignment.
Some implementations of the ophthalmic imaging system can include an indicator light system that projects an indicator light on the imaged eye to provide misalignment information for the processor.
While this specification contains many specifics, these should not be construed as limitations on the scope of the invention or of what can be claimed, but rather as descriptions of features specific to particular embodiments. Certain features that are described in this specification in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination. Moreover, although features can be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination can be directed to a subcombination or variation of a subcombination.
Number | Name | Date | Kind |
---|---|---|---|
4164222 | Prokhorov et al. | Aug 1979 | A |
4198143 | Karasawa | Apr 1980 | A |
4235529 | Kawase et al. | Nov 1980 | A |
4465348 | Lang | Aug 1984 | A |
4520816 | Schachar et al. | Jun 1985 | A |
4533222 | Ishikawa | Aug 1985 | A |
4538608 | L'Esperance, Jr. | Sep 1985 | A |
4554917 | Tagnon | Nov 1985 | A |
4638801 | Daly et al. | Jan 1987 | A |
4764005 | Webb et al. | Aug 1988 | A |
4881808 | Bille et al. | Nov 1989 | A |
4901718 | Bille et al. | Feb 1990 | A |
4907586 | Bille et al. | Mar 1990 | A |
5048946 | Sklar et al. | Sep 1991 | A |
5049147 | Danon | Sep 1991 | A |
5054907 | Sklar et al. | Oct 1991 | A |
5098426 | Sklar et al. | Mar 1992 | A |
5112328 | Taboada et al. | May 1992 | A |
5139022 | Lempert | Aug 1992 | A |
5246435 | Bille et al. | Sep 1993 | A |
5255025 | Volk | Oct 1993 | A |
5286964 | Fountain | Feb 1994 | A |
5321501 | Swanson et al. | Jun 1994 | A |
5336215 | Hsueh et al. | Aug 1994 | A |
5391165 | Fountain et al. | Feb 1995 | A |
5439462 | Bille et al. | Aug 1995 | A |
5493109 | Wei et al. | Feb 1996 | A |
5549632 | Lai | Aug 1996 | A |
5656186 | Mouron et al. | Aug 1997 | A |
5738676 | Hammer et al. | Apr 1998 | A |
5779696 | Berry et al. | Jul 1998 | A |
5795295 | Hellmuth et al. | Aug 1998 | A |
5936706 | Takagi | Aug 1999 | A |
5954648 | Van Der Brug | Sep 1999 | A |
5954711 | Ozaki et al. | Sep 1999 | A |
5994690 | Kulkarni et al. | Nov 1999 | A |
6004314 | Wei et al. | Dec 1999 | A |
6095648 | Birngruber et al. | Aug 2000 | A |
6099522 | Knopp et al. | Aug 2000 | A |
6137585 | Hitzenberger et al. | Oct 2000 | A |
6254595 | Juhasz et al. | Jul 2001 | B1 |
6288784 | Hitzenberger et al. | Sep 2001 | B1 |
6314311 | Williams et al. | Nov 2001 | B1 |
6317616 | Glossop | Nov 2001 | B1 |
6337925 | Cohen et al. | Jan 2002 | B1 |
6377349 | Fercher | Apr 2002 | B1 |
6379005 | Williams et al. | Apr 2002 | B1 |
6451009 | Dasilva et al. | Sep 2002 | B1 |
6454761 | Freedman | Sep 2002 | B1 |
6497701 | Shimmick et al. | Dec 2002 | B2 |
6529758 | Shahidi | Mar 2003 | B2 |
6579282 | Bille et al. | Jun 2003 | B2 |
6623476 | Juhasz et al. | Sep 2003 | B2 |
6687010 | Horii et al. | Feb 2004 | B1 |
6730074 | Bille et al. | May 2004 | B2 |
6741359 | Wei et al. | May 2004 | B2 |
6751033 | Goldstein et al. | Jun 2004 | B2 |
6755819 | Waelti | Jun 2004 | B1 |
6763259 | Hauger et al. | Jul 2004 | B1 |
6769769 | Podoleanu et al. | Aug 2004 | B2 |
6775007 | Izatt et al. | Aug 2004 | B2 |
6863667 | Webb et al. | Mar 2005 | B2 |
6887232 | Bille | May 2005 | B2 |
6899707 | Scholler et al. | May 2005 | B2 |
6932807 | Tomita et al. | Aug 2005 | B1 |
6991629 | Juhasz et al. | Jan 2006 | B1 |
7006232 | Rollins et al. | Feb 2006 | B2 |
7018376 | Webb et al. | Mar 2006 | B2 |
7027233 | Goldstein et al. | Apr 2006 | B2 |
7061622 | Rollins et al. | Jun 2006 | B2 |
7072045 | Chen et al. | Jul 2006 | B2 |
7072047 | Westphal et al. | Jul 2006 | B2 |
7079254 | Kane et al. | Jul 2006 | B2 |
7102756 | Izatt et al. | Sep 2006 | B2 |
7113818 | Podoleanu et al. | Sep 2006 | B2 |
7126693 | Everett et al. | Oct 2006 | B2 |
7130054 | Ostrovsky et al. | Oct 2006 | B2 |
7133137 | Shimmick | Nov 2006 | B2 |
7139077 | Podoleanu et al. | Nov 2006 | B2 |
7145661 | Hitzenberger | Dec 2006 | B2 |
7148970 | de Boer | Dec 2006 | B2 |
7184148 | Alphonse | Feb 2007 | B2 |
7207983 | Hahn et al. | Apr 2007 | B2 |
7248371 | Chan et al. | Jul 2007 | B2 |
7268885 | Chan et al. | Sep 2007 | B2 |
7280221 | Wei | Oct 2007 | B2 |
7307733 | Chan et al. | Dec 2007 | B2 |
7310150 | Guillermo et al. | Dec 2007 | B2 |
7312876 | Chan et al. | Dec 2007 | B2 |
7319566 | Prince et al. | Jan 2008 | B2 |
7329002 | Nakanishi | Feb 2008 | B2 |
7330270 | O'Hara et al. | Feb 2008 | B2 |
7330273 | Podoleanu et al. | Feb 2008 | B2 |
7335223 | Obrebski | Feb 2008 | B2 |
7336366 | Choma et al. | Feb 2008 | B2 |
7342659 | Horn et al. | Mar 2008 | B2 |
7347548 | Huang et al. | Mar 2008 | B2 |
7352444 | Seams et al. | Apr 2008 | B1 |
7355716 | de Boer et al. | Apr 2008 | B2 |
7364296 | Miller et al. | Apr 2008 | B2 |
7365856 | Everett et al. | Apr 2008 | B2 |
7365859 | Yun et al. | Apr 2008 | B2 |
7370966 | Fukuma et al. | May 2008 | B2 |
7371230 | Webb et al. | May 2008 | B2 |
7372578 | Akiba et al. | May 2008 | B2 |
7388672 | Zhou et al. | Jun 2008 | B2 |
7390089 | Loesel et al. | Jun 2008 | B2 |
7400410 | Baker et al. | Jul 2008 | B2 |
7402159 | Loesel et al. | Jul 2008 | B2 |
7426037 | Ostrovsky et al. | Sep 2008 | B2 |
7433046 | Everett et al. | Oct 2008 | B2 |
7452077 | Meyer et al. | Nov 2008 | B2 |
7452080 | Wiltberger et al. | Nov 2008 | B2 |
7461658 | Jones et al. | Dec 2008 | B2 |
7466423 | Podoleanu et al. | Dec 2008 | B2 |
7470025 | Iwanaga | Dec 2008 | B2 |
7477764 | Haisch | Jan 2009 | B2 |
7480058 | Zhao et al. | Jan 2009 | B2 |
7480059 | Zhou et al. | Jan 2009 | B2 |
7488070 | Hauger et al. | Feb 2009 | B2 |
7488930 | Ajgaonkar et al. | Feb 2009 | B2 |
7492466 | Chan et al. | Feb 2009 | B2 |
7503916 | Shimmick | Mar 2009 | B2 |
7508525 | Zhou et al. | Mar 2009 | B2 |
7535577 | Podoleanu et al. | May 2009 | B2 |
7537591 | Feige et al. | May 2009 | B2 |
7557928 | Ueno | Jul 2009 | B2 |
7575322 | Somani | Aug 2009 | B2 |
7593559 | Toth et al. | Sep 2009 | B2 |
7602500 | Izatt et al. | Oct 2009 | B2 |
7604351 | Fukuma et al. | Oct 2009 | B2 |
7614744 | Abe | Nov 2009 | B2 |
7630083 | de Boer et al. | Dec 2009 | B2 |
7631970 | Wei | Dec 2009 | B2 |
7633627 | Choma et al. | Dec 2009 | B2 |
7643152 | de Boer et al. | Jan 2010 | B2 |
7797119 | De Boer et al. | Sep 2010 | B2 |
7813644 | Chen et al. | Oct 2010 | B2 |
7898712 | Adams et al. | Mar 2011 | B2 |
8262646 | Frey et al. | Sep 2012 | B2 |
8394084 | Palankar et al. | Mar 2013 | B2 |
20010022648 | Lai | Sep 2001 | A1 |
20020013574 | Elbrecht et al. | Jan 2002 | A1 |
20020082466 | Han | Jun 2002 | A1 |
20020097374 | Payne et al. | Jul 2002 | A1 |
20020133145 | Gerlach et al. | Sep 2002 | A1 |
20020198516 | Knopp | Dec 2002 | A1 |
20030090674 | Zeylikovich et al. | May 2003 | A1 |
20030206272 | Cornsweet et al. | Nov 2003 | A1 |
20040039378 | Lin | Feb 2004 | A1 |
20040059321 | Knopp et al. | Mar 2004 | A1 |
20040151466 | Crossman-Bosworth et al. | Aug 2004 | A1 |
20040243233 | Phillips | Dec 2004 | A1 |
20050010109 | Faul | Jan 2005 | A1 |
20050015120 | Seibel et al. | Jan 2005 | A1 |
20050021011 | LaHaye | Jan 2005 | A1 |
20050173817 | Fauver et al. | Aug 2005 | A1 |
20050192562 | Loesel et al. | Sep 2005 | A1 |
20050201633 | Moon et al. | Sep 2005 | A1 |
20050203492 | Nguyen et al. | Sep 2005 | A1 |
20050215986 | Chernyak et al. | Sep 2005 | A1 |
20050284774 | Mordaunt | Dec 2005 | A1 |
20050286019 | Wiltberger et al. | Dec 2005 | A1 |
20050288745 | Andersen et al. | Dec 2005 | A1 |
20060020172 | Luerssen et al. | Jan 2006 | A1 |
20060077346 | Matsumoto | Apr 2006 | A1 |
20060100613 | McArdle et al. | May 2006 | A1 |
20060179992 | Kermani | Aug 2006 | A1 |
20060187462 | Srinivasan et al. | Aug 2006 | A1 |
20060195076 | Blumenkranz et al. | Aug 2006 | A1 |
20060206102 | Shimmick | Sep 2006 | A1 |
20070013867 | Ichikawa | Jan 2007 | A1 |
20070121069 | Andersen et al. | May 2007 | A1 |
20070126985 | Wiltberger et al. | Jun 2007 | A1 |
20070129709 | Andersen et al. | Jun 2007 | A1 |
20070129775 | Mordaunt et al. | Jun 2007 | A1 |
20070147730 | Wiltberger et al. | Jun 2007 | A1 |
20070173791 | Raksi | Jul 2007 | A1 |
20070173794 | Frey et al. | Jul 2007 | A1 |
20070173795 | Frey et al. | Jul 2007 | A1 |
20070185475 | Frey et al. | Aug 2007 | A1 |
20070189664 | Andersen et al. | Aug 2007 | A1 |
20070216909 | Everett et al. | Sep 2007 | A1 |
20070219541 | Kurtz | Sep 2007 | A1 |
20070230520 | Mordaunt et al. | Oct 2007 | A1 |
20070282313 | Huang et al. | Dec 2007 | A1 |
20070291277 | Everett et al. | Dec 2007 | A1 |
20070299429 | Amano | Dec 2007 | A1 |
20080033406 | Andersen et al. | Feb 2008 | A1 |
20080049188 | Wiltberger et al. | Feb 2008 | A1 |
20080055543 | Meyer et al. | Mar 2008 | A1 |
20080056610 | Kanda | Mar 2008 | A1 |
20080071254 | Lummis et al. | Mar 2008 | A1 |
20080088795 | Goldstein et al. | Apr 2008 | A1 |
20080100612 | Dastmalchi et al. | May 2008 | A1 |
20080281303 | Culbertson et al. | Nov 2008 | A1 |
20080281413 | Culbertson et al. | Nov 2008 | A1 |
20080319427 | Palanker | Dec 2008 | A1 |
20090012507 | Culbertson et al. | Jan 2009 | A1 |
20090088734 | Mordaunt | Apr 2009 | A1 |
20090125005 | Chernyak et al. | May 2009 | A1 |
20090131921 | Kurtz et al. | May 2009 | A1 |
20090149742 | Kato et al. | Jun 2009 | A1 |
20090157062 | Hauger et al. | Jun 2009 | A1 |
20090161827 | Gertner et al. | Jun 2009 | A1 |
20090168017 | O'Hara et al. | Jul 2009 | A1 |
20090177189 | Raksi | Jul 2009 | A1 |
20090268161 | Hart et al. | Oct 2009 | A1 |
20100004641 | Frey et al. | Jan 2010 | A1 |
20100004643 | Frey et al. | Jan 2010 | A1 |
20100007848 | Murata | Jan 2010 | A1 |
20100022994 | Frey et al. | Jan 2010 | A1 |
20100022995 | Frey et al. | Jan 2010 | A1 |
20100022996 | Frey et al. | Jan 2010 | A1 |
20100042079 | Frey et al. | Feb 2010 | A1 |
20100110377 | Maloca et al. | May 2010 | A1 |
20100208199 | Levis et al. | Aug 2010 | A1 |
20100324543 | Kurtz et al. | Dec 2010 | A1 |
20110022036 | Frey et al. | Jan 2011 | A1 |
20110118609 | Goldshleger et al. | May 2011 | A1 |
20110196350 | Friedman et al. | Aug 2011 | A1 |
20110202044 | Goldshleger et al. | Aug 2011 | A1 |
20110222020 | Izatt et al. | Sep 2011 | A1 |
20110319873 | Raksi et al. | Dec 2011 | A1 |
20120274903 | Sayeram et al. | Nov 2012 | A1 |
Number | Date | Country |
---|---|---|
1444946 | Aug 2004 | EP |
2322083 | May 2011 | EP |
4-503913 | Jul 1992 | JP |
2002345758 | Dec 2002 | JP |
2004-531344 | Oct 2004 | JP |
2009-523556 | Jun 2009 | JP |
9009141 | Aug 1990 | WO |
9808048 | Feb 1998 | WO |
03002008 | Jan 2003 | WO |
03062802 | Jul 2003 | WO |
2006074469 | Jul 2006 | WO |
2007084694 | Jul 2007 | WO |
2007106326 | Sep 2007 | WO |
2007130411 | Nov 2007 | WO |
2010075571 | Jul 2010 | WO |
Entry |
---|
Arimoto et al., “Imaging Properties of Axicon in a Scanning Optical System,” Nov. 1, 1992, Applied Optics, 31 (31):6652-6657. |
Birngruber et al., “In-Vivo Imaging of the Development of Linear and Non-Linear Retinal Laser Effects Using Optical Coherence Tomography in Correlation with Histopathological Findings,” 1995, Proc. SPIE 2391:21-27. |
Chinn, S.R., et al., “Optical coherence tomography using a frequency-tunable optical source”, Optics Letters, 22 (5):340-342, Mar. 1997. |
European Search Report, European Patent Application No. 10191057.8, mailed Mar. 16, 2011, to be published by the USPTO. |
Fercher et al., “Eye-Length Measurement by Interferometry With Partially Coherent Light,” Mar. 1988, Optics Letters, 13(3):186-188. |
Fercher et al., “Measurement of Intraocular Distances by Backscattering Spectral Interferometry,” May 15, 1995, Optics Comm. 117:43-48. |
Hee, M., et al., “Femtosecond transillumination optical coherence tomography”, Optics Letters, 18(12):950-952, Jun. 1993. |
Huber, R., et al., “Three-dimensional and C-mode OCT imaging with a compact, frequency swept laser source at 1300 nm”, Optics Express, 13(26):10523-10538, Dec. 2005. |
Izatt et al., “Micron-Resolution Biomedical Imaging With Optical Coherence Tomography,” Oct. 1993, Optics & Photonics News, pp. 14-19. |
Kamensky, V., et al., “In situ monitoring of the middle IR laser ablation of a cataract-suffered human lens by optical coherent tomography”, Proc. SPIE, 2930:222-229, 1996. |
Kamensky, V., et al., “Monitoring and animation of laser ablation process in cataracted eye lens using coherence tomography”, Proc. SPIE, 2981:94-102, 1997. |
Massow, O., et al., “Optical coherence tomography controlled femtosecond laser microsurgery system”, Proceedings of the SPIE—Optical Coherence Tomography and Coherence Techniques III, vol. 6627, pp. 662717(1)-662717(6), Aug. 2007. |
Ohmi, M., et al., “In-situ Observation of Tissue Laser Ablation Using Optical Coherence Tomography”, Optical and Quantum Electronics, 37(13-15):1175-1183, Dec. 2005. |
PCT International Search Report for International Application No. PCT/US2011/023710 mailed Aug. 24, 2011. |
PCT International Search Report for International Application No. PCT/US2011/025332 mailed Sep. 16, 2011. |
PCT International Search Report for International Application No. PCT/US2010/056701 mailed Jan. 12, 2011. |
PCT International Search Report for International Application No. PCT/US2008/075511 mailed Mar. 12, 2009. |
Sarunic, M., et al., “Instantaneous complex conjugate resolved spectral domain and swept-source OCT using 3×3 fiber couplers”, Optics Express, 13(3):957-967, Feb. 2005. |
Sarunic, M., et al., “Real-time quadrature projection complex conjugate resolved Fourier domain optical coherence tomography”, Optics Letters, 31(16):2426-2428, Aug. 2006. |
Sarunic, M., et al., “Imaging the Ocular Anterior Segment With Real-Time, Full-Range Fourier-Domain Optical Coherence Tomography”, Archives of Ophthalmology, 126(4):537-542, Apr. 2008. |
Stern et al., “Femtosecond Optical Ranging of Corneal Incision Depth,” Jan. 1989, Investigative Ophthalmology & Visual Science, 30(1):99-104. |
Swanson et al., “In vivo retinal imaging by optical coherence tomography”, Optics Letters, 18(21), 1864-1866, Nov. 1993. |
Tao, Y., et al., “High-speed complex conjugate resolved retinal spectral domain optical coherence tomography using sinusoidal phase modulation”, Optics letters, 32(20):2918-2920, Oct. 2007. |
Wojtkowski et al., “In Vivo Human Retinal Imaging by Fourier Domain Optical Coherence Tomography,” Jul. 2002, Journal of Biomedical Optics 7(3):457-463, 7 pages. |
Yun, S.H., et al., “Wavelength-swept fiber laser with frequency shifted feedback and resonantly swept intra-cavity acoustooptic tunable filter”, IEEE Journal of Selected Topics in Quantum Electronics, 3(4):1087-1096, Aug. 1997. |
PCT International Search Report corresponding to PCT Application Serial No. PCT/US2011/051466 dated Apr. 10, 2012. |
Partial International Search Report corresponding to PCT Application Serial No. PCT/US2012/035927 dated Aug. 3, 2012. |
PCT International Search Report dated Dec. 7, 2012 for International Application No. PCT/US2012/035927, filed May 1, 2012. |
PCT International Search Report and Written Opinion dated Feb. 9, 2012 for International Application Serial No. PCT/US2011/040223. |
Aslyo-Vogel et al., “Darstellung von LTK-Läsionen durch optische Kurzkohärenztomographie (OCT) und Polarisationsmikroskopie nach Sirius-Rot-Färbung”, Ophthalmologe, pp. 487-491, 7-97. |
Bagayev et al., “Optical coherence tomography for in situ monitoring of laser corneal ablation”, Journal of Biomedical Optics, 7(4), pp. 633-642 (Oct. 2002). |
Blaha et al., “The slit lamp and the laser in ophthalmology — a new laser slit lamp”, SPIE Optical Instrumentation for Biomedical Laser Applications, vol. 658, pp. 38-42, 1986. |
Boppart, S., et al., “Intraoperative Assessment of Microsurgery with Three-dimensional Optical Coherence Tomography”, Radiology, 208(1):81-86, Jul. 1998. |
Davidson, “Analytic Waveguide Solutions and the Coherence Probe Microscope”, Microelectronic Engineering, 13, pp. 523-526, 1991. |
Drexler, W., et al., “Measurement of the thickness of fundus layers by partial coherence tomography”, Optical Engineering, 34(3):701-710, Mar. 1995. |
Dyer, P., et al., “Optical Fibre Delivery and Tissue Ablation Studies using a Pulsed Hydrogen Fluoride Laser”, Lasers in Medical Science, 7:331-340, 1992. |
Fercher et al., “In Vivo Optical Coherence Tomography”, American Journal of Ophthalmology, 116(1), pp. 113-114, 1993. |
Fujimoto, J., et al., “Biomedical Imaging using Optical Coherence Tomography”, 1994, 67. |
Hammer, D., “Ultrashort pulse laser induced bubble creation thresholds in ocular media”, SPIE, 2391:30-40, 1995. |
Hauger, C., et al., “High speed low coherence interferometer for optical coherence tomography”, Proceedings of SPIE, 4619:1-9, 2002. |
Hee, M., et al., “Optical Coherence tomography of the Human Retina”, Arch Ophthalmol, 113:325-332; Mar. 1995. |
Hitzenberger et al., “Interferometric Measurement of Corneal Thickness With Micrometer Precision”, American Journal of Ophthalmology, 118:468-476, Oct. 1994. |
Hitzenberger, C., et al., “Retinal layers located with a precision of 5 μm by partial coherence interferometry”, SPIE, 2393:176-181, 1995. |
Itoh et al., “Absolute measurements of 3-D shape using white-light interferometer”, SPIE Interferometry: Techniques and Analysis, 1755:24-28, 1992. |
Izatt et al., “Ophthalmic Diagnostics using Optical Coherence Tomography”, SPIE Ophthalmic Technologies, 1877:136-144, 1993. |
Izatt, J., et al., “Micrometer-Scale Resolution Imaging of the Anterior Eye in vivo With Optical Coherence Tomography”, Arch Ophthalmol, 112:1584-1589, Dec. 1994. |
Jean, B., et al., “Topography assisted photoablation”, SPIE, vol. 3591:202-208, 1999. |
Kamensky, V., et al., “In Situ Monitoring of Laser Modification Process in Human Cataractous Lens and Porcine Cornea Using Coherence Tomography”, Journal of biomedical Optics, 4(1), 137-143, Jan 1999. |
Lee et al., “Profilometry with a coherence scanning microscope”, Applied Optics, 29(26), 3784-3788, Sep. 10, 1990. |
Lubatschowski, “The German Ministry of Research and education funded this OCT guided fs laser surgery in Sep. 2005”, http://www.laser-zentrum-hannoverde/download/pdf/taetigkeitsbericht2005.pdf. |
Massow, O., et al., “Femtosecond laser microsurgery system controlled by OCT”, Laser Zentrum Hannover e.V., The German Ministry of education and research, 19 slides, 2007. |
Puliafito, Carmen, “Final technical Report: Air Force Grant #F49620-93-I-03337(1)” dated Feb. 12, 1997, 9 pages. |
Ren, Q., et al., “Axicon: A New Laser Beam Delivery System for Corneal Surgery”, IEEE Journal of Quantum Electronics, 26(12):2305-2308, Dec. 1990. |
Ren, Q., et al., “Cataract Surgery with a Mid-Infrared Endo-laser System”, SPIE Ophthalmic Technologies II, 1644:188-192, 1992. |
Thompson, K., et al., “Therapeutic and Diagnostic Application of Lasers in Ophthalmology”, Proceedings of the IEEE, 80(6):838-860, Jun. 1992. |
Thrane, L, et al., “Calculation of the maximum obtainable probing depth of optical coherence tomography in tissue”, Proceedings of SPIE, 3915:2-11, 2000. |
Wisweh, H., et al., “OCT controlled vocal fold femtosecond laser microsurgery”, Laser Zentrum Hannover e.V., The German Ministry of education and research, Grants: 13N8710 and 13N8712; 23 slides, 2008. |
Number | Date | Country | |
---|---|---|---|
20120281185 A1 | Nov 2012 | US |