The present invention relates to one or more imaging systems including at least one light source, optics, and at least one camera for capturing and recording images of a patient's eye. The invention further relates to a system and methods for allowing an ophthalmologist to easily and conveniently recreate the slit-lamp examination by accessing the captured images.
Ophthalmologists use a variety of devices for imaging of a patient's eye, including slit-lamps, ophthalmoscopes, fundus cameras, and scanning laser ophthalmoscopes (SLOs). The ophthalmic slit-lamp examination has remained largely unchanged for over sixty years. The slit lamp is a versatile instrument used by ophthalmologists for examining a patient's eye. It consists of a microscope, an illumination source, and a mechanical support system to facilitate positioning the illumination source at various angles with respect to the eye. Ophthalmologists and optometrists typically examine the eye by first horizontally scanning across the eye using various slit beam thicknesses and orientations to examine the most anterior structures such as the cornea and conjunctiva. The examiner then adjusts the focus plane posteriorly to horizontally scan across the anterior chamber of the eye. The focus is then adjusted more posteriorly to horizontally scan across the iris and anterior crystalline lens. The process is repeated to examine the posterior aspect of the crystalline lens and anterior vitreous.
Anterior segment ocular imaging (e.g., slit-lamp) photography allows ophthalmologists to document and record a given slit-lamp view of an eye. Similarly, slit-lamp video allows ophthalmologists to document and record a slit-lamp examination of a patient's eye. Traditional slit-lamp photography creates an image using a sensor placed in an optical system at a plane optically conjugate to an object which is to be imaged. This is the plane at which the best focus is achieved and therefore the best optical resolution of features in the object results.
Most still and video photography slit-lamp units are created by mounting a camera in place of the viewing oculars or in conjunction with the viewing oculars through the means of a beam splitter. These traditional modalities of recording the slit-lamp exam are limited to either using still photography to capture a single moment of the examination, or taking a video of one's own examination sequence of slit-beam focus, magnification, slit-beam height, width, and angle of incidence. Another health care professional can view the video, but cannot alter any of these variables after the examination. Slit-lamp video also requires a highly trained ophthalmologist or optometrist to perform the examination. No system exists that allows an ophthalmologist or optometrist to perform a virtual slit-lamp examination based on images obtained at an earlier time. Such a system using traditional cameras would require a massive library of images of various slit-beam positions and characteristics, with numerous sequential images stored in at least the x- and z-axes.
A camera captures an image of the illuminated portion of the eye structures via reflected light. Rays which emanate from a point within the object plane in multiple directions are captured by the optical system, and those rays converge to approximately a single point in the conjugate image plane. The set of rays which are summed at any image point is generally constrained by physical apertures placed within the optical assembly. The traditional sensor records the summation of the intensity of light in the plane of the detector. The measurement contains the intensity distribution of light within the plane of the sensor but loses all information about the rays' directions before the summation. Therefore the typical process of recording a traditional image fails to record a large fraction of the information contained in the absorbed light.
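The conjugate-plane relationship underlying the discussion above follows the Gaussian thin-lens equation. By way of illustration only, a minimal sketch (the function name and millimeter units are assumptions, not part of the disclosure):

```python
def conjugate_image_distance(focal_length_mm, object_distance_mm):
    """Gaussian thin-lens equation: 1/f = 1/d_o + 1/d_i.
    Returns the image distance d_i at which rays emanating from an
    object point at distance d_o converge to best focus."""
    if object_distance_mm == focal_length_mm:
        raise ValueError("object at the focal plane images at infinity")
    return 1.0 / (1.0 / focal_length_mm - 1.0 / object_distance_mm)

# An object 100 mm in front of a 50 mm lens is imaged 100 mm behind it.
```

Because a fixed sensor sits at a single image distance, only the one conjugate object plane is recorded in sharp focus, which is why a traditional photograph captures a single focus plane of the eye.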
In an exemplary embodiment of the present disclosure, an imaging system for imaging at least a portion of an eye of a patient is provided. The system comprising a patient support adapted to position the eye of the patient; a movable base moveable relative to the patient support; and an illumination system. The illumination system including at least one light source producing light to illuminate the eye and an illumination system support arm supporting the light source. The illumination system support arm being supported by the moveable base and rotatable relative to the moveable base. The system further comprising an observation system including a plenoptic camera configured to receive imaging rays produced by reflection of light from the eye, and an observation system support arm supporting the imaging system. The observation system support arm being supported by the moveable base and rotatable relative to the moveable base. The observation system further comprising a storage device operatively coupled to the plenoptic camera to receive and store a plurality of images of the eye imaged by the plenoptic camera, each of the stored images having at least one associated component characteristic of one of the patient support, the movable base, the illumination system, and the observation system. In one example, the illumination system further includes a slit forming device which receives illuminating light produced by the at least one light source and provides a line of light to illuminate the eye, the illumination system support arm supporting the slit forming device and wherein the plenoptic camera receives imaging rays produced by reflection of the line of light from the eye. In another example, the illumination system includes a plurality of light sources arranged in an array, the plurality of light sources each produce light to illuminate the eye. 
In a variation thereof, an illumination characteristic of a portion of the plurality of light sources is adjusted through an input device. In a refinement thereof, the illumination characteristic is one of an intensity level and a wavelength spectrum. In another variation thereof, an illumination characteristic of a portion of the plurality of light sources is adjusted through an electronic controller. In a refinement thereof, the illumination characteristic is one of an intensity level and a wavelength spectrum. In a further example, the observation system support arm is rotatable relative to the moveable base independent of the illumination system support arm. In yet a further example, the illumination system support arm is rotatable relative to the moveable base about a first rotation axis and the observation system support arm is rotatable relative to the moveable base about the first rotation axis.
In another exemplary embodiment, a method of analyzing an eye of a patient which has been illuminated with a slit-lamp microscope is provided. The slit-lamp microscope including an illumination system and an observation system. The illumination system including a light source and a slit forming device which provides a line of light to illuminate the eye and the observation system including an imaging system including a plenoptic camera configured to receive imaging rays produced by reflection of the line of light from the eye. The method comprising the steps of storing a plurality of images of the eye imaged by the plenoptic camera while the eye was illuminated with the line of light, each of the stored images having at least one associated slit-lamp microscope characteristic; receiving an image request; and providing a requested image based on at least one of the plurality of images, the image request, and the at least one associated slit-lamp microscope characteristic of the at least one of the plurality of images. In one example, the requested image includes the line of light focused on a first portion of a curved structure. In another example, the method further comprises the steps of receiving an image request for a second image having the line of light focused on a second portion of the curved structure, wherein the line of light is displaced in at least one of an x-axis direction and a y-axis direction and in a z-axis direction; and generating the second image from at least one of the stored images and the light field data of the at least one stored image. In a further example, the method further comprises the step of requesting to walk through the stored images sequentially. In yet a further example, the method further comprises the steps of retrieving an image set from a prior examination; and identifying an image from the prior examination having the same associated slit-lamp microscope characteristic as the requested image. 
In yet a further example, the associated slit-lamp microscope characteristic is one or more of an x-axis position of a moveable base of the slit-lamp supporting the illumination system and the observation system, a y-axis position of the moveable base, a z-axis position of the moveable base, a rotational position of the illumination system, a rotational position of the observation system, a slit width of the slit-forming device, and a magnification of the observation system. In still yet another example, the method further comprises the steps of receiving an image request for a second image having the line of light focused at a different depth within the eye than the first image; and generating the second image from at least one of the stored images and the light field data of the at least one stored image.
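One way such an image request could be serviced is to index each stored image by its associated slit-lamp microscope characteristics and return the closest match. The sketch below is illustrative only; it assumes numeric characteristics and a simple squared-difference metric, and all names are hypothetical:

```python
def nearest_image(stored, request):
    """stored: list of (characteristics dict, image id) records;
    request: dict of target characteristics. Returns the id of the
    stored image whose characteristics are closest to the request
    (sum of squared differences over the requested keys)."""
    def distance(chars):
        return sum((chars[k] - request[k]) ** 2 for k in request)
    return min(stored, key=lambda rec: distance(rec[0]))[1]

# Hypothetical library keyed by base position and slit-beam angle.
library = [
    ({"x": 0.0, "z": 0.0, "slit_angle": 30.0}, "img_001"),
    ({"x": 1.0, "z": 0.5, "slit_angle": 30.0}, "img_002"),
]
```

A request for a view near x = 0.9, z = 0.4 at the same slit angle would then resolve to "img_002", and walking the stored images sequentially amounts to issuing a series of such requests with incrementally varied characteristics.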
In yet another exemplary embodiment of the present disclosure, an imaging system for imaging at least a portion of an eye of a patient is provided. The system comprising a patient support adapted to position the eye of the patient; an illumination system including a light source producing light to illuminate the eye; and an observation system including a plurality of cameras in a spaced apart arrangement, each camera positioned to receive imaging rays produced by reflection of light from the eye. In one example, each camera has an optical axis and the plurality of optical axes are parallel. In another example, the plurality of cameras are arranged along a line generally perpendicular to the optical axes of the plurality of cameras. In a further example, each camera has an optical axis and the plurality of optical axes converge towards a common point. In a variation thereof, the plurality of cameras are arranged along an arc. In a refinement thereof, the arc is a circular arc and the common point is a center of the circular arc. In still another example, the plurality of cameras are plenoptic cameras.
In a further exemplary embodiment of the present disclosure, a method of analyzing an eye of a patient is provided. The method comprising the steps of illuminating the eye with an illumination system, the illumination system including a light source and a slit forming device which provides a line of light to illuminate the eye; positioning a first camera relative to the eye to receive imaging rays produced by a reflection of the line of light from the eye; positioning a second camera relative to the eye to receive imaging rays produced by the reflection of the line of light from the eye; and storing a plurality of images of the eye imaged by the first camera and the second camera while the eye was illuminated with the line of light. In one example, each of the first camera and the second camera has an optical axis, and the optical axes are parallel to each other. In a variation thereof, the first camera and the second camera are arranged along a line generally perpendicular to the optical axes of the first camera and the second camera. In another example, each of the first camera and the second camera has an optical axis, and the optical axes converge towards a common point. In another variation thereof, the first camera and the second camera are arranged along an arc. In a refinement thereof, the arc is a circular arc and the common point is a center of the circular arc. In a further refinement thereof, the first camera and the second camera are plenoptic cameras.
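For the converging arrangement, camera positions and pointing directions along a circular arc can be computed directly. A minimal two-dimensional sketch, assuming the common point (the eye) is placed at the origin and angles are measured from the central axis (all names and conventions are illustrative):

```python
import math

def arc_camera_poses(radius_mm, angles_deg):
    """Place cameras on a circular arc of the given radius around a
    common point at the origin. Each camera's optical axis is the unit
    vector from the camera toward that point, so all axes converge at
    the center of the circular arc."""
    poses = []
    for a in angles_deg:
        rad = math.radians(a)
        position = (radius_mm * math.sin(rad), radius_mm * math.cos(rad))
        axis = (-math.sin(rad), -math.cos(rad))  # unit vector back to origin
        poses.append((position, axis))
    return poses
```

For example, three cameras at -20, 0, and +20 degrees on a 100 mm arc are symmetric about the central camera, whose axis points straight at the common point.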
In yet a further exemplary embodiment of the present disclosure, an imaging system for imaging at least a portion of an eye of a patient is provided. The system comprising a patient support adapted to position the eye of the patient; an illumination system including a light source producing light to illuminate the eye; and an observation system including imaging optics configured to receive imaging rays produced by reflection of light from the eye which are focused by the imaging optics at a first object plane, a first observation unit including a viewfinder which receives imaging rays from the imaging optics and a second observation unit which receives the imaging rays from the imaging optics, the second observation unit including a plenoptic camera and a display, the second observation unit displaying an image of the eye generated based on the imaging rays, the image of the eye being focused at a second object plane spaced apart from the first object plane. In one example, the imaging system further comprises a beamsplitter, the imaging rays reaching the viewfinder through a first path through the beamsplitter and reaching the plenoptic camera through a second path through the beamsplitter. In another example, the first object plane is offset from the second object plane. In a further example, the illumination system includes a plurality of light sources arranged in an array, the plurality of light sources each produce light to illuminate the eye. In a variation thereof, an illumination characteristic of a portion of the plurality of light sources is adjusted through an input device. In a refinement thereof, the illumination characteristic is one of an intensity level and a wavelength spectrum.
In yet still another exemplary embodiment of the present disclosure, a method of analyzing an eye of a patient is provided. The method comprising the steps of illuminating the eye with an illumination system; receiving with imaging optics imaging rays produced by reflection of light from the eye; directing the imaging rays to a viewfinder; directing the imaging rays to a plenoptic camera; focusing the imaging optics on a first object plane in the eye; and displaying on a display operatively coupled to the plenoptic camera a second object plane in the eye. In one example, the first object plane is offset from the second object plane. In a variation thereof, the first object plane takes into account at least one of an optical power of the viewfinder and the optical power of an operator's eyes such that the resultant image viewed by the operator through the viewfinder is focused at the second object plane.
In still a further exemplary embodiment of the present disclosure, an imaging system for imaging at least a portion of a left eye of a patient and at least a portion of a right eye of the patient is provided. The system comprising a patient support adapted to position the left eye and the right eye of the patient; at least one illumination system including at least one light source producing light to illuminate the left eye and the right eye; a first observation system including a first plenoptic camera configured to receive imaging rays produced by reflection of light from the left eye; a second observation system including a second plenoptic camera configured to receive imaging rays produced by reflection of light from the right eye; and a storage device operatively coupled to the first plenoptic camera and to the second plenoptic camera to receive and store a plurality of images of the eye imaged by the first plenoptic camera and the second plenoptic camera. In one example, the at least one illumination system includes a first illumination system including at least a first light source producing light to illuminate the left eye and a second illumination system including at least a second light source producing light to illuminate the right eye.
In a further exemplary embodiment of the present disclosure, a method of analyzing an eye of a patient with an imaging system including an illumination system and an observation system is provided. The illumination system includes a light source. The observation system including an imaging system including a camera configured to receive imaging rays produced by reflection of light from the eye. The method comprising the steps of capturing images of a portion of the eye over time with the camera; monitoring a position of a structure of the eye in the captured images; determining if the structure of the eye is moving towards an unsafe location; and if the structure is moving towards an unsafe location, providing feedback of such movement. In one example, the method further comprises the step of providing a signal to inhibit operation of an instrument which is used to alter a portion of the eye. In a variation thereof, the instrument is an ultrasound probe. In another example, the step of providing feedback of such movement includes at least one of providing an audio output, providing a visual output, and providing a tactile output. In a further example, the camera is a plenoptic camera. In a variation thereof, the structure is a posterior capsule of the eye and the step of determining if the structure of the eye is moving towards the unsafe location includes the step of determining if the posterior capsule is moving forward towards the anterior side of the eye. In a refinement thereof, the step of determining if the structure of the eye is moving towards the unsafe location includes the step of determining whether the movement of the structure has exceeded a threshold amount. In yet a further example, the step of determining if the structure of the eye is moving towards the unsafe location includes the step of determining whether the movement of the structure has exceeded a threshold amount.
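The movement-threshold determination described above can be sketched as a comparison of the latest measured depth against a baseline. The units, the threshold value, and the function name below are assumptions for illustration only:

```python
def capsule_moving_unsafe(depth_history_mm, baseline_mm, threshold_mm=0.5):
    """True if the posterior capsule's most recent measured depth shows
    forward (anterior) movement from the baseline exceeding the
    threshold, i.e. the structure is moving towards an unsafe location."""
    latest = depth_history_mm[-1]
    return (baseline_mm - latest) > threshold_mm

# Forward drift of 0.6 mm exceeds a 0.5 mm threshold: the system could
# then provide feedback (audio, visual, tactile) and signal an
# instrument such as an ultrasound probe to inhibit operation.
```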
In a yet further exemplary embodiment of the present disclosure, a method of analyzing an eye of a patient with an imaging system including an illumination system and an observation system is provided. The illumination system includes a light source. The observation system including a camera configured to receive imaging rays produced by reflection of light from the eye. The method comprising the steps of capturing images of a portion of the eye over time with a plenoptic camera; determining positions of one or more structures of the eye from the captured images; and identifying a first intraocular lens from a library of intraocular lenses for placement in the eye based on the determined positions. In one example, the step of identifying the first intraocular lens from the library of intraocular lenses for placement in the eye based on the determined positions includes the step of comparing the determined positions of the one or more structures of the eye with a database of determined positions for historical patients and a rating of the selected intraocular lens for the historical patients. In a variation thereof, the determined positions include a distance between an anterior capsule of the eye and a posterior capsule of the eye and a position of suspensory ligaments of the eye relative to one of the anterior capsule and the posterior capsule. In a refinement thereof, the database also includes a measure of the final position of a replacement lens of the historical patients, and the step of identifying a first intraocular lens identifies the first lens if the measure has a first value indicating the final position of the lens for a historical patient was as expected and a second lens if the measure has a second value indicating that the final position of the lens for the historical patient was different than expected, the second lens having a different optical power than the first lens.
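The historical-comparison step might be sketched as a nearest-neighbor lookup over prior patients, stepping to a lens of adjacent optical power when the recorded outcome deviated from expectation. The data layout, field names, outcome labels, and lens model names below are all hypothetical:

```python
def select_iol(measured, history, lens_library):
    """measured: dict of anatomical distances for the current patient.
    history: list of (measurements dict, lens_model, outcome) records,
    where outcome is "as_expected" or "deviated".
    lens_library: lens models ordered by increasing optical power.
    Picks the historical patient with the closest anatomy; if that
    patient's lens settled differently than expected, steps to the
    adjacent-power lens in the library."""
    def dist(m):
        return sum((m[k] - measured[k]) ** 2 for k in measured)
    _, lens, outcome = min(history, key=lambda rec: dist(rec[0]))
    if outcome == "as_expected":
        return lens
    i = lens_library.index(lens)
    return lens_library[min(i + 1, len(lens_library) - 1)]
```

This is only one plausible realization of the comparison; the disclosure itself does not specify a metric or an adjustment rule.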
In still another exemplary embodiment of the present disclosure, an imaging system for imaging at least a portion of an eye of a patient is provided. The system comprising a patient support adapted to position the eye of the patient; an illumination system including a plurality of light sources, each producing light to illuminate the eye; and an observation system including imaging optics configured to receive imaging rays produced by reflection of light from the eye. In one example, the observation system includes a plenoptic camera which receives the imaging rays from the imaging optics. In a variation thereof, the imaging system further comprises a storage device operatively coupled to the plenoptic camera to receive and store a plurality of images of the eye imaged by the plenoptic camera, each of the stored images having at least one associated component characteristic of one of the illumination system and the observation system. In another example, the illumination system further includes a slit forming device which receives illuminating light produced by the at least one light source and provides a line of light to illuminate the eye and wherein the plenoptic camera receives imaging rays produced by reflection of the line of light from the eye. In still another example, the plurality of light sources are arranged in an array. In a variation thereof, an illumination characteristic of a portion of the plurality of light sources is adjusted through an input device. In a refinement thereof, the illumination characteristic is one of an intensity level and a wavelength spectrum. In another variation, an illumination characteristic of a portion of the plurality of light sources is adjusted through an electronic controller. In a refinement thereof, the illumination characteristic is one of an intensity level and a wavelength spectrum.
In still another exemplary embodiment of the present disclosure, a method of analyzing an eye of a patient is provided. The method comprising the steps of illuminating the eye with an illumination system, the illumination system including a plurality of light sources; receiving with imaging optics imaging rays produced by reflection of light from the eye; directing the imaging rays to a camera to capture an image; displaying the image; and adjusting an illumination characteristic of a portion of the plurality of light sources to alter an illumination of a portion of the eye. In one example, the illumination characteristic is one of an intensity level and a wavelength spectrum. In another example, the illumination characteristic is adjusted to reduce glare at the portion of the eye.
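Adjusting an illumination characteristic of only a portion of the plurality of light sources to reduce glare could amount to dimming just the flagged array elements. A minimal sketch, in which the step size and the intensity floor are assumptions:

```python
def reduce_glare(led_levels, glare_map, step=0.1, floor=0.2):
    """led_levels: per-LED intensities in 0..1. glare_map: per-LED
    booleans, True where that LED's region of the captured image shows
    glare. Dims only the flagged LEDs by `step`, never below `floor`,
    leaving the rest of the illumination unchanged."""
    return [max(floor, lvl - step) if flagged else lvl
            for lvl, flagged in zip(led_levels, glare_map)]
```

In practice this adjustment could be driven manually through an input device or iterated automatically by an electronic controller until glare in the displayed image falls below an acceptable level.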
While multiple embodiments are disclosed, still other embodiments of the present invention will become apparent to those skilled in the art from the following detailed description, which shows and describes illustrative embodiments of the invention. Accordingly, the drawings and detailed description are to be regarded as illustrative in nature and not restrictive.
While the invention is amenable to various modifications and alternative forms, specific embodiments have been shown by way of example in the drawings and are described in detail below. The intention, however, is not to limit the invention to the particular embodiments described. On the contrary, the invention is intended to cover all modifications, equivalents, and alternatives falling within the scope of the invention as defined by the appended claims.
Before any embodiments of the disclosure are explained in detail, it is to be understood that the disclosure is not limited in its application to the details of construction and the arrangement of components set forth in the following description or illustrated in the following drawings. The disclosure is capable of other embodiments and of being practiced or of being carried out in various ways. The term “a” or “an” entity refers to one or more of that entity. As such, the terms “a” (or “an”), “one or more” and “at least one” may be used interchangeably herein. It is also to be noted that the terms “comprising”, “including”, and “having” may be used interchangeably.
The term “logic” or “control logic” as used herein may include software and/or firmware executing on one or more programmable processors, application-specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), digital signal processors (DSPs), hardwired logic, or combinations thereof. Therefore, in accordance with the embodiments, various logic may be implemented in any appropriate fashion and would remain in accordance with the embodiments herein disclosed.
Referring to
In one embodiment, examination system 100 includes a secondary diffuse illumination source 114 which illuminates portions of the eye not illuminated by the brighter illumination source of illumination system 102. The illumination source 114 may be any light source which provides a generally constant light intensity across a large portion of the eye 10. In one example, secondary diffuse illumination source 114 is supported by illumination system 102. In another example, secondary diffuse illumination source 114 is separate from illumination system 102.
Observation system 104 includes a plenoptic camera 130. Plenoptic camera 130 records light field data associated with the light reflected from eye 10. The light field data permits refocusing of an image recorded by the plenoptic camera 130. Plenoptic camera 130 is operatively coupled to a controller 300. As explained herein, controller 300 stores the images recorded by plenoptic camera 130 and processes image requests. Exemplary plenoptic cameras are the Lytro Illum brand camera available from Lytro, Inc. located at 1300 Terra Bella Avenue in Mountain View, Calif. 94043 and the R5, R11, R29, and RX camera models sold by Raytrix GmbH located at Schauenburgerstrasse 116 D-24118 in Kiel, Germany. Further exemplary plenoptic cameras and/or systems for processing images recorded by plenoptic cameras are disclosed in U.S. Pat. Nos. 7,706,632; 7,936,392; 7,956,924; 8,228,417; 8,238,738; 8,289,440; 8,471,897; 8,593,564; 8,619,177; US20130010260; US20130222633; US20140078259; US20140129988; US20140016019; US20140013273; US20130235267; US20130222652; US20130222606; US20130113981; US20130033636; US20120327222; US20120294590; US20120249550; US20110234841, the disclosures of which are expressly incorporated by reference herein.
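The refocusing enabled by light field data is commonly implemented by shifting the recorded sub-aperture views against one another and averaging ("shift-and-add"). The one-dimensional sketch below illustrates the principle only; it is not how any particular commercial plenoptic camera operates, and all names are hypothetical:

```python
def refocus(subaperture_views, shifts_px):
    """subaperture_views: equal-length 1-D pixel rows, one per viewpoint
    (sub-aperture) of the light field. shifts_px: integer shift applied
    to each view before averaging. Choosing the shifts selects the
    depth at which the synthetic image is brought into focus."""
    width = len(subaperture_views[0])
    out = []
    for x in range(width):
        vals = []
        for view, s in zip(subaperture_views, shifts_px):
            xs = x + s
            if 0 <= xs < width:  # ignore samples shifted off the row
                vals.append(view[xs])
        out.append(sum(vals) / len(vals) if vals else 0.0)
    return out
```

A point at one depth appears displaced between views by its parallax; shifts that cancel that parallax sum the point coherently (sharp focus), while other shifts spread it out (defocus). This is why a single plenoptic exposure can later be refocused to different planes of the eye.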
Referring to
Referring to
An additional exemplary plenoptic camera 130 is disclosed in MANAKOV, Alkhazur et al., A Reconfigurable Camera Add-On for High Dynamic Range, Multispectral, Polarization, and Light-Field Imaging, ACM Transactions on Graphics, Association for Computing Machinery, 2013, Proceedings of SIGGRAPH, 32 (4), pp. 47:1-47:14, the disclosure of which is expressly incorporated by reference herein, wherein an apparatus is added between the imaging plane of a main lens group of a camera and the imaging sensor of the camera. The apparatus includes a pupil matching lens located at the image plane of the main lens group of the camera. The apparatus further includes a kaleidoscope-like arrangement of mirrors which creates multiple views of the image passing through the pupil matching lens, each with a different perspective shift. The multiple images are then cast to the imaging sensor of the camera.
Referring to
Referring to
Referring to
Illumination system 102 and observation system 104 are both moveable relative to moveable base 140 in a y-axis in direction 154 and direction 156 as illustrated in
Although illumination system 102 and observation system 104 are shown being rotatable about a vertical axis, axis 162, one or both of illumination system 102 and observation system 104 may be rotatable about a horizontal axis parallel to the x-axis or another axis in a plane defined by the x-axis and the y-axis. In one embodiment, each of illumination system 102 and observation system 104 is rotatable about a separate axis relative to moveable base 140.
Referring to
Moveable base 208 supports an illumination system 220 and an observation system 222. Illumination system 220 is moveable relative to moveable base 208 in the translation and rotation directions discussed in connection with
Referring to
Illumination system 220 further includes a slit 230 for allowing only a part of the light passing through the condenser lenses 226 and 228 to pass through the slit 230 and out of illumination system 220. The light passing through slit 230 provides a narrow generally rectilinear beam of light 236 (see
In one embodiment, illumination system 220 includes a filter 238 which limits the color of light that progresses through illumination system 220 and is ultimately used to illuminate eye 10. An exemplary filter would be cobalt blue to view fluorescein staining. Other exemplary filters may be used.
Slit 230 has an adjustable width to vary the width of the generally rectilinear beam of light which impinges upon eye 10 of the patient. In one embodiment, a width of slit 230 may be increased to provide generally full illumination of eye 10 of the patient. Exemplary widths for slit 230 are 1 mm and a thin slit having a width of up to about 1 mm. Further exemplary slit widths are in the range of about 0.2 mm to about 1.0 mm. In one embodiment, slit 230 is controlled through a knob or dial provided on illumination system 220. In one embodiment, slit 230 is automatically controlled through a computing system. An exemplary system for adjusting a width of slit 230 is provided in European Patent Application No. EP2695572, the disclosure of which is expressly incorporated by reference herein.
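When slit 230 is automatically controlled through a computing system, the commanded width would be kept within the supported range. A trivial sketch using the approximately 0.2 mm to 1.0 mm range mentioned above (the function name and defaults are assumptions):

```python
def set_slit_width(requested_mm, min_mm=0.2, max_mm=1.0):
    """Clamp a commanded slit width to the mechanically supported range
    before driving the slit actuator."""
    return max(min_mm, min(max_mm, requested_mm))
```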
Illumination system 220 further includes a condenser lens 232 for converging the light that has passed through the slit 230 onto the eye 10 of the patient. The above-described slit 230 and the eye 10 to be examined are located in a conjugate position relative to the condenser lens 232 so that a local illumination ray of the slit 230 is projected to, for example, the cornea of the eye 10 to be examined. Light from slit 230 reaches eye 10 through a reflection from half-mirror 240. The light reflected from eye 10 is returned towards half-mirror 240 and passes through half-mirror 240 to reach observation system 222.
In one embodiment, illumination system 220 includes a collimator system which focuses the light from the source and then uses a collimator lens to produce a collimated beam of light emanating from light source 224. A portion of the collimated beam passes through slit 230 and is incident upon eye 10 of the patient. In one embodiment, the light source is a white light source. In one embodiment, the collimated beam is filtered to limit the color of the light that progresses through the illumination system and ultimately illuminates eye 10.
In one embodiment, illumination system 220 includes light source 600 described in further detail herein with regard to
Observation system 222 includes an objective lens 250, a zooming optical system 252, a condenser lens 254, a beam splitter 256, a relay lens 258, a prism 260 for changing the optical path on the side of the housing of observation system 222 and an ocular lens 262. The image of the eye 10 is formed on an imaging point 264 and may be observed by the eye 266 of the person conducting the eye exam. The zooming optical system 252 changes a magnification of the image of eye 10.
Beamsplitter 256 also directs a portion of the light entering observation system 222 to a condenser lens 270 which directs the light into a plenoptic camera 130 through a reflection from a mirror 272. In one embodiment, plenoptic camera 130 is a still image camera. In one embodiment, plenoptic camera 130 is a video camera. In both embodiments, plenoptic camera 130 is used to capture a plurality of images of the eye 10 for subsequent examination as discussed herein. Plenoptic camera 130 captures both the position and direction of light propagating in space.
Referring to
Referring to
Referring to
Ophthalmologists and optometrists typically examine the eye 10 by first horizontally scanning across the eye using various slit beam thicknesses and orientations to examine the most anterior structures such as the cornea and conjunctiva.
Further, moveable base 208 may support an illumination system rotary sensor 314 and an observation system rotary sensor 316. Illumination system rotary sensor 314 monitors a rotation of illumination system 220 relative to moveable base 208. Observation system rotary sensor 316 monitors a rotation of observation system 222 relative to moveable base 208. Exemplary sensors include optical sensors, mechanical sensors, electrical sensors, and combinations thereof.
Slit-lamp microscope 200 further includes a slit sensor 318, a filter sensor 320, a diffuse light illumination sensor 321, and an illumination sensor 322. Slit sensor 318 provides an indication of a slit width setting of slit 230. An exemplary system for monitoring a slit width is disclosed in European Patent Application No. EP2695572, the disclosure of which is expressly incorporated by reference herein. Filter sensor 320 provides an indication of whether a filter is placed in the light beam of illumination system 220. In one embodiment, a filter wheel is provided and an angular position of the filter wheel is monitored. Diffuse light illumination sensor 321 provides an indication of the background illumination power level of a diffuse light source 114.
In one embodiment, one or more of moveable base 208, illumination system 220, observation system 222, slit 230, filter 238, light source 224, zooming optical system 252, and other settings of slit-lamp microscope 200 are set through manual inputs. In one embodiment, one or more of moveable base 208, illumination system 220, observation system 222, slit 230, filter 238, light source 224, zooming optical system 252, and other settings of slit-lamp microscope 200 are set by controller 300 controlling motors or other actuators.
Controller 300 includes one or more input devices 360 to receive input from an operator of slit-lamp microscope 200. Exemplary input devices include keys, buttons, joysticks, touch screens, dials, switches, mice, and trackballs, which provide user control of slit-lamp microscope 200. Controller 300 further includes one or more output devices 362 to provide feedback or information to an operator. Exemplary output devices include displays, lights, and audio devices.
In one embodiment, the information stored in memory 430 is made available to additional controllers, illustratively controller 400, over a network 402. In one embodiment, the logic of controller 300 is also made available to controller 400 over network 402. An exemplary output device 362 of controller 300 is a network access device which is capable of accessing network 402. An exemplary network access device is a modem.
Controller 400 includes input devices and output devices to receive input from an operator and to provide feedback or information to the operator, respectively. An exemplary operator for controller 400 is an ophthalmologist located remote from slit-lamp microscope 200. In one embodiment, controller 400 includes the logic described herein of controller 300 and retrieves images and related information over network 402 from controller 300. This arrangement allows an ophthalmologist to review a slit-lamp examination remote from slit-lamp microscope 200. Further, because the images obtained during the initial examination, or derived from it, are stored on a memory of controller 400 or a memory accessible by controller 400, the ophthalmologist may review the slit-lamp examination at a later time than the original examination.
The plurality of images 410 and sensor data 412 are stored in memory 430. Memory 430 may include, but is not limited to, memory associated with the execution of software and memory associated with the storage of data. Memory 430 includes non-transitory computer-readable media. Computer-readable media may be any available media that may be accessed by one or more processors of controller 300 and includes both volatile and non-volatile media. Further, computer-readable media may be one or both of removable and non-removable media. By way of example, computer-readable media may include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, Digital Versatile Disk (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which may be used to store the desired information and which may be accessed by controller 300.
In one embodiment, memory 430 also stores patient information 432 and examination information 434. Exemplary patient information includes a patient name or other identifier, patient medical history, and other suitable information. Exemplary examination information includes eye being examined, additional settings of slit-lamp microscope 200, and other suitable information. In one embodiment, controller 400 also includes or has access to image data 410 and sensor data 412 along with the logic of controller 300. As such, the discussions herein related to controller 300 apply equally to a remotely located controller, such as controller 400.
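One way the stored images, sensor data 412, and examination grouping might be organized is sketched below; the field names and units are hypothetical, chosen only to mirror the characteristics listed above.

```python
from dataclasses import dataclass, field

@dataclass
class ImageRecord:
    """One captured light-field image plus the slit-lamp settings
    (sensor data) recorded at capture time.  Units are illustrative."""
    image_id: int
    x: float             # moveable-base position, mm
    y: float
    z: float
    illum_angle: float   # illumination-arm rotation, degrees
    obs_angle: float     # observation-arm rotation, degrees
    slit_width: float    # mm
    magnification: float
    filter_name: str = "none"

@dataclass
class Examination:
    """Groups the records of one examination under a patient identifier."""
    patient_id: str
    records: list = field(default_factory=list)

    def add(self, rec: ImageRecord):
        self.records.append(rec)
```

Storing the settings alongside each image is what later lets the viewer request, for example, "the same position but a narrower slit."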
An exemplary representation of Image Set 1 is provided herein.
The slit lamp microscope apparatus 200 described in this application permits a technician with basic skills to obtain the light-field data needed to recreate a slit-lamp examination at a later time and in a different location. The light-field camera captures an image at a given slit-beam size and angular orientation. A motorized apparatus moves the slit-beam along a horizontal axis to an adjacent or an overlapping position where another high resolution light-field image would be obtained. This process is repeated multiple times to scan across the structures of the eye. This scanning process can be repeated with slit-beams of various widths and angles of incidence to simulate the types of views obtained by an ophthalmologist using a traditional slit-lamp. This scanning allows for libraries of adjacent light-field slit images to be created for the various slit-beam widths and angles.
Retroillumination and specular reflection views are also possible through the careful placement of the illumination source and the viewing angle of the plenoptic camera 130. Non-slit illumination such as a ring light or point source of light can be utilized in a similar manner (especially for retroillumination through a pupil). During light-field data acquisition, images are evaluated in real time to discard errant images, for example those associated with patient blinking, glare or patient movement. One embodiment of the apparatus includes soft arms that contact the upper and/or lower lids to allow for blink-free imaging. Stabilization algorithms that use landmarks of the eye and other stabilization techniques may be used to improve both image quality and the ability to collate adjacent images for later display. In one embodiment, images are captured with illumination system 220 positioned at −45° from straight-on center.
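The real-time rejection of errant frames could, for example, rest on simple brightness statistics, as in this sketch; the thresholds and the helper name `frame_ok` are assumptions, not part of the described apparatus.

```python
import numpy as np

def frame_ok(frame, dark_thresh=0.05, sat_frac=0.02):
    """Accept a frame unless it looks like a blink (overall brightness
    collapses when the lid covers the eye) or heavy glare (too many
    near-saturated pixels).  Pixel values are assumed in [0, 1];
    thresholds are illustrative, not clinical."""
    f = np.asarray(frame, dtype=np.float64)
    if f.mean() < dark_thresh:           # blink: the image goes dark
        return False
    if (f >= 0.98).mean() > sat_frac:    # glare: large saturated region
        return False
    return True
```

Frames failing either check would simply be discarded and the capture at that slit position repeated.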
In one embodiment of the apparatus, the images obtained by the light-field camera are analyzed in real time to automatically place the focus of the slit beam at various clinically important anatomic structures of the eye. These can include the tear film, anterior cornea, posterior cornea, anterior chamber midpoint, anterior lens capsule, central lens, and posterior lens capsule. Although these focal planes can be retrospectively viewed with light-field processing, thin slit-beam illumination may not be simultaneously focused at each of these layers (unless collimated light is used).
Other embodiments of the apparatus allow for variable angles of examination. The typical slit-lamp sequence is performed with vertically oriented slit-beams and horizontal movement of the viewing oculars, but the orientation of the examination could be rotated 90 degrees (horizontal slit/vertical scanning) or to any oblique angle. Various combinations of slit-beam focal plane, slit size and angular orientation imaging can be pre-chosen via the apparatus software to balance the ophthalmic completeness of the examination and the computational demands required to recreate various slit-beam views.
In one example, a user through input devices 360 (or the respective input devices of controller 400) requests a specific image or image set to be displayed. For instance, a user may want to first walk through the examination as it was taken. Thus, the user may request the first image of Image Set 1. In this case image selection logic 460 would return the first image of Image Set 1.
In another example, the user through input devices 360 (or the respective input devices of controller 400) requests the image closest to a given characteristic of slit-lamp microscope 200. For instance, the user may want an image at the same x,y,z positioning of slit-lamp microscope 200, but with a narrower slit width. In this case image selection logic 460 would search the sensor data 412 stored in memory 430 to determine which image has the closest x,y,z, position and a narrower slit width.
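A minimal sketch of this search, assuming the sensor data for each image is held as a dictionary of the characteristics named above:

```python
def closest_narrower(records, x, y, z, slit_width):
    """Among stored records with a narrower slit than the current one,
    return the record whose base position is nearest to (x, y, z).
    `records` is a list of dicts of sensor data; the keys are
    illustrative stand-ins for sensor data 412."""
    candidates = [r for r in records if r["slit_width"] < slit_width]
    if not candidates:
        return None   # no stored image satisfies the request
    return min(candidates,
               key=lambda r: (r["x"] - x) ** 2 +
                             (r["y"] - y) ** 2 +
                             (r["z"] - z) ** 2)
```

The same pattern generalizes to any requested characteristic (rotation angle, magnification, filter) by filtering on that key instead.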
In a further example, the user requests an image offset from the current image in one of x, y, z or the rotational angle of illumination system 220 or observation system 222. For instance, the user may want an image at the same x,y positioning of slit-lamp microscope 200, but focused deeper into the eye along the z-axis.
In a still further example, the user may request that multiple images be combined into a focal stack image wherein the in-focus portions of multiple image are combined or otherwise displayed together to generate an image having multiple depths along the z-axis in focus. In one example, the user may want to combine portions of multiple images, either taken during the examination or generated from the light field data, together to generate a focused image of a curved structure of the eye 10 which extends along the z-axis, such as the cornea 12 of the eye 10.
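Focus stacking of this kind is commonly done by keeping, per pixel, the locally sharpest source image; the Laplacian-based sketch below illustrates the idea and is not necessarily the exact method used by the apparatus.

```python
import numpy as np

def focal_stack(images):
    """Merge images focused at different depths by keeping, at each
    pixel, the value from the image that is locally sharpest there.
    Sharpness is estimated with a discrete Laplacian; larger absolute
    response means stronger local detail, i.e. better focus."""
    stack = np.asarray(images, dtype=np.float64)   # shape (n, h, w)
    sharp = np.empty_like(stack)
    for i, img in enumerate(stack):
        lap = (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
               np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4 * img)
        sharp[i] = np.abs(lap)
    best = np.argmax(sharp, axis=0)                # winning image per pixel
    rows, cols = np.indices(best.shape)
    return stack[best, rows, cols]
```

Applied to images refocused at successive z-planes along the cornea, this yields a single image with the curved structure in focus end to end.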
In yet a further example, the user may want to selectively focus the z-axis on a clinically important structure, such as the anterior cornea 12, so that x-axis or y-axis movements would follow the curved anatomy of the cornea.
The optical microscope further includes an observation system 540 including a first observation unit 510 and a second observation unit 530. First observation unit 510 includes imaging optics 512 configured to receive imaging rays produced by reflection of light from the object of interest 502. The imaging optics 512 provide an image of a desired object plane 550 of the object of interest. First observation unit 510 further includes a viewfinder 514 through which an operator may view the image formed by optics 512. The light travels through a beam splitter 520 to reach viewfinder 514.
As is known in the art, a spacing or other characteristic of optics 512 may be altered to offset the focus of the imaging optics 512 from the desired object plane to an offset object plane 552. This is done to allow the operator of the first observation unit 510 to take into account the optical power of the viewfinder and/or the optical power of the operator's eyes. Thus, the image formed by imaging optics 512 alone will not be of the desired object plane 550, but rather of an offset plane 552 from the desired object plane, to take into account the optical power of the viewfinder 514 and/or the operator's eyes.
Second observation unit 530 shares the imaging optics 512 and beam splitter 520 with first observation unit 510. Second observation unit 530 further includes a plenoptic camera 130 which is coupled to a controller 300. Controller 300 displays an image captured by plenoptic camera 130 on a display 532.
A person viewing the image displayed with display 532 may not be satisfied with the focus of the image because it is not focused at the desired object plane 550. As stated earlier, the operator of first observation unit 510 has set the characteristics of imaging optics 512 to provide the desired image through viewfinder 514. This may result in a fuzzy image being displayed with display 532. Through input devices 360, a person viewing the image displayed with display 532 can utilize the light field data recorded by plenoptic camera 130 to provide a refocused image on display 532 which is focused at the desired object plane 550.
In one embodiment, controller 300 includes processing sequences to monitor one or more portions of eye 10 over time. Controller 300 based on the received images determines whether a position of a structure of the eye 10 has changed over time. In one example, controller 300 monitors posterior capsule 30 of eye 10 to determine whether it has moved forward towards the anterior portion of eye 10. This type of movement is important to note when performing surgery on eye 10, such as providing a replacement lens 18 for eye 10. During surgery, an opening is provided in the anterior capsule 31 of eye 10 and the removal of lens 18 is aided with an ultrasonic probe. The posterior capsule 30 may move forward during or subsequent to this process. If the probe contacts the posterior capsule 30, the posterior capsule 30 may be punctured.
Controller 300 determines if the one or more monitored structures are moving towards an unsafe location, as represented by block 906. In the case of the posterior capsule 30, controller 300 determines whether the posterior capsule 30 is moving forward towards the anterior side of the eye 10. In one example, controller 300 determines whether the movement of the monitored structure has exceeded a threshold amount. If not, the controller 300 continues to monitor the position of the one or more monitored structures of the eye 10. If so, controller 300 provides feedback to the operator of the movement of the one or more monitored structures towards an unsafe location, as represented by block 908. Exemplary types of the feedback include one or more of audio, visual, and tactile outputs. Controller 300 may further provide an input to an instrument contacting the eye to inhibit further operation of the instrument, as represented by block 910. In the case of lens removal, the instrument may be an ultrasonic probe and controller 300 may inhibit further operation of the probe based on the location or movement of the posterior capsule 30.
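The monitoring logic of blocks 906-910 might look like the following sketch, where the depth convention, threshold, and function names are illustrative assumptions rather than the actual control code:

```python
def monitor_capsule(baseline_z, measured_z_series, threshold=0.5):
    """Compare each new posterior-capsule depth estimate against the
    baseline.  Anterior (forward) movement beyond `threshold` mm
    raises an alarm.  Smaller z is taken to mean more anterior.
    Returns a list of (measurement, alarm) tuples."""
    events = []
    for z in measured_z_series:
        forward_shift = baseline_z - z
        alarm = forward_shift > threshold
        events.append((z, alarm))
    return events

def instrument_enabled(events):
    """The ultrasonic probe stays enabled only while no alarm has fired
    (the inhibit input of block 910)."""
    return not any(alarm for _, alarm in events)
```

In practice the alarm would also drive the audio, visual, or tactile feedback of block 908.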
Based on the determined positions, controller 300 suggests a first intraocular lens from a library of intraocular lenses, as represented by block 954. In one embodiment, the first intraocular lens is selected from the library of intraocular lenses through a comparison of the determined positions to a database of determined positions for historical patients and a rating of the selected intraocular lens for those respective historical patients.
In one example, after the original lens 18 is removed, the space between the anterior capsule 31 and the posterior capsule 30 is filled with a fluid. Controller 300 then determines a distance between the anterior capsule 31 and the posterior capsule 30. As is known in the art, this distance may be used to select the appropriate replacement lens 18 for insertion into the eye. Controller 300 further determines the position of the suspensory ligaments relative to one of the anterior capsule 31 and posterior capsule 30. Controller 300 then searches a database for empirical data of historical patients having similar separations of the anterior capsule 31 and posterior capsule 30 and similar offsets for the suspensory ligaments 34. The database also includes a measure of the final position of lens 18 after healing for those historical patients. If the final position of lens 18 was as expected, then controller 300 suggests a first lens 18. If the final position of lens 18 was different than expected, such as further posterior, then controller 300 may suggest a second lens having a different power than the first lens.
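A toy version of this historical lookup is sketched below; the record fields and the half-diopter adjustment are purely illustrative assumptions, not clinical guidance:

```python
def suggest_lens(capsule_gap, ligament_offset, history):
    """Find the historical patient whose anterior/posterior capsule
    separation and suspensory-ligament offset are most similar, then
    suggest that patient's lens power -- adjusted if that lens settled
    away from its expected final position.  `history` entries are
    dicts with illustrative keys."""
    match = min(history,
                key=lambda h: (h["capsule_gap"] - capsule_gap) ** 2 +
                              (h["ligament_offset"] - ligament_offset) ** 2)
    power = match["lens_power"]
    # Hypothetical compensation when the historical lens settled
    # further posterior or anterior than expected.
    if match["final_position"] == "posterior":
        power += 0.5
    elif match["final_position"] == "anterior":
        power -= 0.5
    return power
```

A production system would of course weight many more measurements and use validated lens-power formulas rather than a fixed offset.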
Returning to the slit-lamp examples provided herein, in addition to standard light-field image processing, the apparatus employs software techniques to collate adjacent images for a specific slit-beam size and angular orientation. A library of adjacent images is created and stored through the techniques described above. This collection of images is analogous to the series of instantaneous slit-lamp images seen by an ophthalmologist scanning across the eye. Separate libraries of images can be created for the slit views obtained at each slit-beam size and angular orientation. If various slit focal planes are used, separate libraries are created at each position. The images in these libraries can be cross-referenced to similar images in other slit focal planes. These cross-referenced images would be analogous to the images obtained by an ophthalmologist moving the slit-lamp joystick posteriorly to view the tear film, cornea, anterior chamber, iris, lens and vitreous. A different type of cross-referencing can create a library of images analogous to rotating the slit-beam about a pivot point.
These libraries of images allow the end-user to simulate the effect of a slit-lamp examination by using a trackpad, joystick, keyboard, touch-sensitive display screen or similar controller. Depending on the default settings chosen, a given slit image is projected on a display monitor. The user can manipulate the controller (joystick, trackpad, keyboard, touch-sensitive display screen) to simulate an x axis movement of the slit-lamp and call up adjacent x-axis images of the ocular structure of interest. Continued manipulation of the controller in the same direction would cause adjacent images to be displayed on the monitor to create a motion picture similar to the dynamic view obtained by an ophthalmologist using a slit-lamp.
Moving the controller in the y-axis would cause an upper or lower part of the captured image to be displayed. Moving the controller in z-axis would cause a different focal plane to come into focus. These z-axis movements could display a refocused light-field image—or in the case of a thin slit—a new light-field image of the same position but a posteriorly focused thin slit. In this manner, more anterior or posterior portions of the ocular structure would be visualized. Other controllers could call up images with thicker or thinner slit beams to simulate changing the slit thickness on a slit lamp. Likewise, other controllers could call up images with different slit beam orientations to simulate rotating the slit beam apparatus around its pivot point.
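Navigation through such an image library reduces to clamped index arithmetic, as in this sketch; the three axes shown are an assumed subset of the characteristics a full library would index.

```python
class SlitLampViewer:
    """Steps through a library of images indexed by (x position,
    focal depth, slit width), mimicking joystick motion on a physical
    slit lamp.  The library layout and step handling are illustrative."""

    def __init__(self, n_x, n_z, n_widths):
        self.n = (n_x, n_z, n_widths)
        self.pos = [0, 0, 0]   # current x, z, and slit-width indices

    def move(self, axis, step):
        """axis 0 = scan across the eye, 1 = focus deeper,
        2 = change slit width.  Indices are clamped to the library,
        and the new index tuple selects the image to display."""
        self.pos[axis] = max(0, min(self.n[axis] - 1,
                                    self.pos[axis] + step))
        return tuple(self.pos)
```

Holding the controller in one direction repeatedly calls `move`, producing the motion-picture effect described above.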
The previously described techniques of imaging use light-field photography to image a slit-beam as it illuminates various structures in the eye. In another embodiment of the apparatus, the light-field photography is performed without a slit-beam. Instead diffuse illumination is used, but during the viewing mode software selectively illuminates certain pixels so that a virtual slit effect is obtained. The end user can then use a mouse, joystick, keyboard, trackpad, touch-sensitive screen or similar controller to manipulate the virtual slit to simulate an entire slit-lamp exam. The advantage of this approach would be the elimination of the need for multiple slit-beam passes of the eye structures and the computing power necessary to perform the light-field photography reconstructions. Similarly, instead of illuminating certain pixels, another embodiment of the device uses bright diffuse illumination of the eye structures, and then software selectively dims the brightness of the majority of the image pixels, leaving only those pixels in a virtual slit configuration at the brightest level. Software can selectively create the inverse of this type of image (dimmed slit-beam in a brightly illuminated field) as this may allow for diagnostic views not possible in any conventional slit lamp examination.
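The virtual slit and its inverse can be sketched as a per-column dimming mask applied to a diffusely illuminated image; the band geometry and `dim` factor here are assumptions.

```python
import numpy as np

def virtual_slit(image, center_col, width, dim=0.1, invert=False):
    """Simulate a slit beam on a diffusely illuminated image by
    dimming pixels outside a vertical band of `width` columns centered
    on `center_col`.  With invert=True the band itself is dimmed
    instead, giving the 'inverse slit' view described above."""
    img = np.asarray(image, dtype=np.float64).copy()
    cols = np.arange(img.shape[1])
    in_slit = np.abs(cols - center_col) <= width // 2
    mask = in_slit if not invert else ~in_slit
    img[:, ~mask] *= dim
    return img
```

Sweeping `center_col` across the image under controller input recreates the horizontal scan of a physical slit beam without multiple illumination passes.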
The software portion of the apparatus allows for various playback and sharing settings. Comparison of a current examination to previous examinations can be made through side-by-side or overlay display. Slit lamp images can be made available to patients or other professionals either in raw form, allowing the user to "drive through" the exam again, or through a summary video created from the raw data.
One embodiment of the device adapts the plenoptic camera and logic systems described above to be used in conjunction with an operating microscope. This embodiment uses the light-field data and a processor to adjust the z-plane focus in real-time to either a user-defined plane or a plane chosen by an image recognition and tracking system locked on to pertinent eye anatomy such as the surgical limbus, conjunctival vessels or iris aperture. The x and y-axis can also be tracked using this system. Alternatively, the device allows for post-surgical adjustments of the z-axis focal plane and x- and y-axis orientation to allow for less fatiguing viewing of surgical video or for the post-processing of surgical video for educational dissemination.
One embodiment of the device uses a gonioscopic lens attachment to permit ophthalmologic viewing of the filtration angle structures of the eye using the slit-lamp, plenoptic camera and logic systems described above.
One embodiment of the device uses a fundus lens attachment similar to a Hruby lens, 78 diopter, 90 diopter or Volk Superfield lens to permit ophthalmologic viewing of the posterior vitreous and retina using the slit-lamp, plenoptic camera and logic systems described above.
One embodiment of the device uses a Goldmann tonometer attachment to the slit-lamp, plenoptic camera and logic systems described above to facilitate the measurement of the intraocular pressure in the eye.
One embodiment of the device optimizes the optics to examine the structures of the eye through the use of specular reflection. This embodiment allows for qualitative and quantitative evaluation of the corneal endothelium and includes the measurement of the endothelial cell count.
Other embodiments of the device combine the plenoptic imaging system with other established ocular imaging systems including but not limited to optical coherence tomography, scanning laser ophthalmoscopy, and laser interferometry using the same or different patient support 210, the same or different controller 300, memory 430, processor(s) 450, input devices 360, output devices 362, and remote controller 400.
One embodiment of the device uses an Nd:YAG, argon, excimer, femtosecond or other laser in conjunction with the slit-lamp microscope, plenoptic camera and logic systems described above to treat various eye diseases and conditions either locally or remotely through a networked system.
One embodiment of the device attaches either a dropper system or a spray system to the slit lamp microscope to administer ocular pharmaceuticals such as anesthetics, dyes, dilating or constricting drops to aid in diagnosis or treatment of eye disease.
One embodiment of the device incorporates the controller 400 into an electronic medical records system so that the systems described above can be accessed and controlled from within a given patient's medical record. A still photo, video or sets of images or videos can be identified and separately stored in the electronic medical record file. These images or videos can also be printed or electronically transmitted to other providers or patients either from within the electronic record or from controllers 300 or 400.
In one embodiment, the slit lamp 200 includes light source 600 described herein.
In one example, the optical characteristics of light sources 602 are adjusted to increase visibility and minimize artifacts, such as a glare region 610, that appear in the images captured by plenoptic camera 130.
By having individually controllable light sources 602, light source 600 is able to output customizable illumination patterns for illuminating eye 10.
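As a sketch of such a pattern, the functions below compute a ring-shaped intensity map for a hypothetical rectangular array of sources and dim a single source implicated in a glare artifact; the geometry and level values are illustrative.

```python
import numpy as np

def ring_pattern(n_rows, n_cols, radius, tol=0.6):
    """Intensity map for an n_rows x n_cols array of individually
    controllable sources: full power on a ring of the given radius
    (in units of source spacing) about the array center, off elsewhere."""
    r0, c0 = (n_rows - 1) / 2.0, (n_cols - 1) / 2.0
    rows, cols = np.indices((n_rows, n_cols))
    dist = np.hypot(rows - r0, cols - c0)
    return (np.abs(dist - radius) <= tol).astype(float)

def dim_glare_source(pattern, row, col, level=0.2):
    """Reduce the output of one source identified as causing a glare
    artifact, leaving the rest of the pattern unchanged."""
    p = pattern.copy()
    p[row, col] *= level
    return p
```

Other patterns (points, arcs, diffuse fields) follow the same scheme of writing per-source intensities into the map.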
In one embodiment, a characteristic of an image captured by plenoptic camera 130 is altered by controller 300 without modification of a characteristic of the light source of examination system 100.
Examination system 800 further includes two observation systems 820A and 820B. Each of the observation systems 820 includes imaging optics 812 configured to receive imaging rays produced by reflection of the light from the respective eyes 10 of the patient. The respective imaging optics 812 provide an image of a desired object plane 850 of the left and right eye. In particular, observation system 820A images the right eye 10 and observation system 820B images the left eye 10. The imaging rays passing through imaging optics 812 are provided to respective plenoptic cameras 130, which in turn provide images of the respective eye 10 of the patient to a controller 300. The images are displayed on an associated display 814 by controller 300 for observation by a user. The user may adjust the interpupillary spacing between observation systems 820A and 820B through input device 818. In one embodiment, both observation systems 820A and 820B are supported on a support, such as moveable base 208.
Examination system 800 allows the user to obtain images of both the left and right eyes 10 of a patient and, subsequent to capturing images, to adjust the depth of focus from object plane 850 to an offset object plane 852 in order to view other structures of the eye. This allows the operator to independently change a depth of focus of both the left and right eye images and view various structures of the respective eyes.
In an exemplary embodiment of the present disclosure, an imaging system for imaging at least a portion of an eye of a patient is provided. The system comprising a patient support adapted to position the eye of the patient; a movable base moveable relative to the patient support; and an illumination system. The illumination system including at least one light source producing light to illuminate the eye and an illumination system support arm supporting the light source. The illumination system support arm being supported by the moveable base and rotatable relative to the moveable base. The system further comprising an observation system including a plenoptic camera configured to receive imaging rays produced by reflection of light from the eye, and an observation system support arm supporting the imaging system. The observation system support arm being supported by the moveable base and rotatable relative to the moveable base. The observation system further comprising a storage device operatively coupled to the plenoptic camera to receive and store a plurality of images of the eye imaged by the plenoptic camera, each of the stored images having at least one associated component characteristic of one of the patient support, the movable base, the illumination system, and the observation system. In one example, the illumination system further includes a slit forming device which receives illuminating light produced by the at least one light source and provides a line of light to illuminate the eye, the illumination system support arm supporting the slit forming device and wherein the plenoptic camera receives imaging rays produced by reflection of the line of light from the eye. In another example, the illumination system includes a plurality of light sources arranged in an array, the plurality of light sources each produce light to illuminate the eye. 
In a variation thereof, an illumination characteristic of a portion of the plurality of light sources is adjusted through an input device. In a refinement thereof, the illumination characteristic is one of an intensity level and a wavelength spectrum. In another variation thereof, an illumination characteristic of a portion of the plurality of light sources is adjusted through an electronic controller. In a refinement thereof, the illumination characteristic is one of an intensity level and a wavelength spectrum. In a further example, the observation system support arm is rotatable relative to the moveable base independent of the illumination system support arm. In yet a further example, the illumination system support arm is rotatable relative to the moveable base about a first rotation axis and the observation system support arm is rotatable relative to the moveable base about the first rotation axis.
In another exemplary embodiment, a method of analyzing an eye of a patient which has been illuminated with a slit-lamp microscope is provided. The slit-lamp microscope including an illumination system and an observation system. The illumination system including a light source and a slit forming device which provides a line of light to illuminate the eye and the observation system including an imaging system including a plenoptic camera configured to receive imaging rays produced by reflection of the line of light from the eye. The method comprising the steps of storing a plurality of images of the eye imaged by the plenoptic camera while the eye was illuminated with the line of light, each of the stored images having at least one associated slit-lamp microscope characteristic; receiving an image request; and providing a requested image based on at least one of the plurality of images, the image request, and the at least one associated slit-lamp microscope characteristic of the at least one of the plurality of images. In one example, the requested image includes the line of light focused on a first portion of a curved structure. In another example, the method further comprises the steps of receiving an image request for a second image having the line of light focused on a second portion of the curved structure, wherein the line of light is displaced in at least one of an x-axis direction and a y-axis direction and in a z-axis direction; and generating the second image from at least one of the stored images and the light field data of the at least one stored image. In a further example, the method further comprises the step of requesting to walk through the stored images sequentially. In yet a further example, the method further comprises the steps of retrieving an image set from a prior examination; and identifying an image from the prior examination having the same associated slit-lamp microscope characteristic as the requested image. 
In yet a further example, the associated slit-lamp microscope characteristic is one or more of an x-axis position of a moveable base of the slit-lamp supporting the illumination system and the observation system, a y-axis position of the moveable base, a z-axis position of the moveable base, a rotational position of the illumination system, a rotational position of the observation system, a slit width of the slit-forming device, and a magnification of the observation system. In still yet another example, the method further comprises the steps of receiving an image request for a second image having the line of light focused at a different depth within the eye than the first image; and generating the second image from at least one of the stored images and the light field data of the at least one stored image.
In yet another exemplary embodiment of the present disclosure, an imaging system for imaging at least a portion of an eye of a patient is provided. The system comprising a patient support adapted to position the eye of the patient; an illumination system including a light source producing light to illuminate the eye; and an observation system including a plurality of cameras in a spaced apart arrangement, each camera positioned to receive imaging rays produced by reflection of light from the eye. In one example, each camera has an optical axis and the plurality of optical axes are parallel. In another example, the plurality of cameras are arranged along a line generally perpendicular to the optical axes of the plurality of cameras. In a further example, each camera has an optical axis and the plurality of optical axes converge towards a common point. In a variation thereof, the plurality of cameras are arranged along an arc. In a refinement thereof, the arc is a circular arc and the common point is a center of the circular arc. In still another example, the plurality of cameras are plenoptic cameras.
In a further exemplary embodiment of the present disclosure, a method of analyzing an eye of a patient is provided. The method comprising the steps of illuminating the eye with an illumination system, the illumination system including a light source and a slit forming device which provides a line of light to illuminate the eye; positioning a first camera relative to the eye to receive imaging rays produced by a reflection of the line of light from the eye; positioning a second camera relative to the eye to receive imaging rays produced by the reflection of the line of light from the eye; and storing a plurality of images of the eye imaged by the first camera and the second camera while the eye was illuminated with the line of light. In one example, each of the first camera and the second camera has an optical axis, and the optical axes are parallel to each other. In a variation thereof, the first camera and the second camera are arranged along a line generally perpendicular to the optical axes of the first camera and the second camera. In another example, each of the first camera and the second camera has an optical axis, and the optical axes converge towards a common point. In another variation thereof, the first camera and the second camera are arranged along an arc. In a refinement thereof, the arc is a circular arc and the common point is a center of the circular arc. In a further refinement thereof, the first camera and the second camera are plenoptic cameras.
In yet a further exemplary embodiment of the present disclosure, an imaging system for imaging at least a portion of an eye of a patient is provided. The system comprising a patient support adapted to position the eye of the patient; an illumination system including a light source producing light to illuminate the eye; and an observation system including imaging optics configured to receive imaging rays produced by reflection of light from the eye which are focused by the imaging optics at a first object plane, a first observation unit including a viewfinder which receives imaging rays from the imaging optics and a second observation unit which receives the imaging rays from the imaging optics, the second observation unit including a plenoptic camera and a display, the second observation unit displaying an image of the eye generated based on the imaging rays, the image of the eye being focused at a second object plane spaced apart from the first object plane. In one example, the imaging system further comprises a beamsplitter, the imaging rays reaching the viewfinder through a first path through the beamsplitter and reaching the plenoptic camera through a second path through the beamsplitter. In another example, the first object plane is offset from the second object plane. In a further example, the illumination system includes a plurality of light sources arranged in an array, the plurality of light sources each produce light to illuminate the eye. In a variation thereof, an illumination characteristic of a portion of the plurality of light sources is adjusted through an input device. In a refinement thereof, the illumination characteristic is one of an intensity level and a wavelength spectrum.
In yet still another exemplary embodiment of the present disclosure, a method of analyzing an eye of a patient is provided. The method comprising the steps of illuminating the eye with an illumination system; receiving with imaging optics imaging rays produced by reflection of light from the eye; directing the imaging rays to a viewfinder; directing the imaging rays to a plenoptic camera; focusing the imaging optics on a first object plane in the eye; and displaying on a display operatively coupled to the plenoptic camera a second object plane in the eye. In one example, the first object plane is offset from the second object plane. In a variation thereof, the first object plane takes into account at least one of an optical power of the viewfinder and the optical power of an operator's eyes such that the resultant image viewed by the operator through the viewfinder is focused at the second object plane.
In still a further exemplary embodiment of the present disclosure, an imaging system for imaging at least a portion of a left eye of a patient and at least a portion of a right eye of the patient is provided. The system comprising a patient support adapted to position the left eye and the right eye of the patient; at least one illumination system including at least one light source producing light to illuminate the left eye and the right eye; a first observation system including a first plenoptic camera configured to receive imaging rays produced by reflection of light from the left eye; a second observation system including a second plenoptic camera configured to receive imaging rays produced by reflection of light from the right eye; and a storage device operatively coupled to the first plenoptic camera and to the second plenoptic camera to receive and store a plurality of images of the left eye and the right eye imaged by the first plenoptic camera and the second plenoptic camera. In one example, the at least one illumination system includes a first illumination system including at least a first light source producing light to illuminate the left eye and a second illumination system including at least a second light source producing light to illuminate the right eye.
In a further exemplary embodiment of the present disclosure, a method of analyzing an eye of a patient with an imaging system including an illumination system and an observation system is provided. The illumination system includes a light source. The observation system including an imaging system including a camera configured to receive imaging rays produced by reflection of light from the eye. The method comprising the steps of capturing images of a portion of the eye over time with the camera; monitoring a position of a structure of the eye in the captured images; determining if the structure of the eye is moving towards an unsafe location; and if the structure is moving towards an unsafe location, providing feedback of such movement. In one example, the method further comprises the step of providing a signal to inhibit operation of an instrument which is used to alter a portion of the eye. In a variation thereof, the instrument is an ultrasound probe. In another example, the step of providing feedback of such movement includes at least one of providing an audio output, providing a visual output, and providing a tactile output. In a further example, the camera is a plenoptic camera. In a variation thereof, the structure is a posterior capsule of the eye and the step of determining if the structure of the eye is moving towards the unsafe location includes the step of determining if the posterior capsule is moving forward towards the anterior side of the eye. In a refinement thereof, the step of determining if the structure of the eye is moving towards the unsafe location includes the step of determining whether the movement of the structure has exceeded a threshold amount. In yet a further example, the step of determining if the structure of the eye is moving towards the unsafe location includes the step of determining whether the movement of the structure has exceeded a threshold amount.
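The monitoring-and-feedback loop described above can be sketched as follows (Python; the depth convention and callback names are hypothetical): each frame's tracked structure position is compared against a baseline, and once forward movement exceeds the threshold the operator is alerted and the instrument is inhibited.

```python
def monitor_structure(depths, baseline, threshold, inhibit_instrument, alert):
    """Watch a tracked eye structure (e.g., the posterior capsule) across
    captured frames. `depths` are per-frame axial positions, where a
    smaller value means the structure has moved anteriorly (forward)."""
    for depth in depths:
        forward_movement = baseline - depth
        if forward_movement > threshold:
            # Provide feedback of the unsafe movement, then inhibit the
            # instrument (e.g., an ultrasound probe).
            alert(f"structure moved {forward_movement:.2f} mm forward")
            inhibit_instrument()
            return False  # unsafe movement detected
    return True  # movement stayed within the threshold
```

The `alert` callback could drive an audio, visual, or tactile output, per the feedback examples above.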
In a yet further exemplary embodiment of the present disclosure, a method of analyzing an eye of a patient with an imaging system including an illumination system and an observation system is provided. The illumination system includes a light source. The observation system including a camera configured to receive imaging rays produced by reflection of light from the eye. The method comprising the steps of capturing images of a portion of the eye over time with a plenoptic camera; determining positions of one or more structures of the eye from the captured images; and identifying a first intraocular lens from a library of intraocular lenses for placement in the eye based on the determined positions. In one example, the step of identifying the first intraocular lens from the library of intraocular lenses for placement in the eye based on the determined positions includes the step of comparing the determined positions of the one or more structures of the eye with a database of determined positions for historical patients and a rating of the selected intraocular lens for the historical patients. In a variation thereof, the determined positions include a distance between an anterior capsule of the eye and a posterior capsule of the eye and a position of suspensory ligaments of the eye relative to one of the anterior capsule and the posterior capsule. In a refinement thereof, the database also includes a measure of the final position of a replacement lens of the historical patients and the step of identifying a first intraocular lens identifies the first lens if the measure has a first value indicating the final position of the lens for a historical patient was as expected and a second lens if the measure has a second value indicating that the final position of the lens for the historical patient was different than expected, the second lens having a different optical power than the first lens.
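One way to realize the historical-database comparison is a nearest-neighbor lookup over the measured anatomical positions (an illustrative Python sketch; the record layout and the fallback to an alternate lens of different optical power are assumptions based on the description above):

```python
import math

def select_iol(measured_positions, historical_cases):
    """Identify an intraocular lens by finding the historical patient whose
    measured anatomy (e.g., anterior-to-posterior capsule distance, position
    of the suspensory ligaments) is closest to this patient's."""
    best = min(historical_cases,
               key=lambda case: math.dist(measured_positions, case["positions"]))
    # If that patient's lens settled in the expected final position, reuse
    # that lens; otherwise choose the alternate lens of different power.
    return best["lens"] if best["as_expected"] else best["alternate_lens"]
```

In practice the database would hold many cases and likely weight each anatomical measurement differently; this sketch shows only the comparison step.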
In still another exemplary embodiment of the present disclosure, an imaging system for imaging at least a portion of an eye of a patient is provided. The system comprising a patient support adapted to position the eye of the patient; an illumination system including a plurality of light sources, each producing light to illuminate the eye; and an observation system including imaging optics configured to receive imaging rays produced by reflection of light from the eye. In one example, the observation system includes a plenoptic camera which receives the imaging rays from the imaging optics. In a variation thereof, the imaging system further comprises a storage device operatively coupled to the plenoptic camera to receive and store a plurality of images of the eye imaged by the plenoptic camera, each of the stored images having at least one associated component characteristic of one of the illumination system and the observation system. In another example, the illumination system further includes a slit forming device which receives illuminating light produced by the at least one light source and provides a line of light to illuminate the eye and wherein the plenoptic camera receives imaging rays produced by reflection of the line of light from the eye. In still another example, the plurality of light sources are arranged in an array. In a variation thereof, an illumination characteristic of a portion of the plurality of light sources is adjusted through an input device. In a refinement thereof, the illumination characteristic is one of an intensity level and a wavelength spectrum. In another variation, an illumination characteristic of a portion of the plurality of light sources is adjusted through an electronic controller. In a refinement thereof, the illumination characteristic is one of an intensity level and a wavelength spectrum.
In still another exemplary embodiment of the present disclosure, a method of analyzing an eye of a patient is provided. The method comprising the steps of illuminating the eye with an illumination system, the illumination system including a plurality of light sources; receiving with imaging optics imaging rays produced by reflection of light from the eye; directing the imaging rays to a camera to capture an image; displaying the image; and adjusting an illumination characteristic of a portion of the plurality of light sources to alter an illumination of a portion of the eye. In one example, the illumination characteristic is one of an intensity level and a wavelength spectrum. In another example, the illumination characteristic is adjusted to reduce glare at the portion of the eye.
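Adjusting an illumination characteristic of only a portion of the light-source array, for example dimming the sources responsible for glare, can be sketched as follows (Python; the per-source glare map and dimming factor are illustrative assumptions):

```python
def reduce_glare(intensities, glare_map, glare_threshold, dim_factor=0.5):
    """Return new per-source intensity levels for the array: sources whose
    estimated glare contribution exceeds the threshold are dimmed, and the
    remaining sources are left unchanged."""
    return [level * dim_factor if glare > glare_threshold else level
            for level, glare in zip(intensities, glare_map)]
```

A controller (or an operator through an input device) could re-capture and iterate until the glare at the affected portion of the eye falls below the threshold.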
Various modifications and additions can be made to the exemplary embodiments discussed without departing from the scope of the present invention. For example, while the embodiments described above refer to particular features, the scope of this invention also includes embodiments having different combinations of features and embodiments that do not include all of the described features. Accordingly, the scope of the present invention is intended to embrace all such alternatives, modifications, and variations as fall within the scope of the claims, together with all equivalents thereof.
This application is a continuation application of U.S. patent application Ser. No. 16/905,408, filed Jun. 18, 2020, titled METHODS FOR ANALYZING THE EYE, which is a continuation of U.S. patent application Ser. No. 16/109,593, now U.S. Pat. No. 10,687,703, filed Aug. 22, 2018, titled SYSTEMS AND METHODS FOR ANALYZING THE EYE, which is a divisional application of U.S. patent application Ser. No. 15/438,480, now U.S. Pat. No. 10,092,183, filed Feb. 21, 2017, titled SYSTEMS AND METHODS FOR ANALYZING THE EYE, which is a continuation-in-part of PCT Application Serial No. PCT/US2015/047747, filed Aug. 31, 2015, titled SYSTEMS AND METHODS FOR ANALYZING THE EYE, which claims the benefit of U.S. Provisional Application 62/044,253, filed Aug. 31, 2014, titled SYSTEMS AND METHODS FOR ANALYZING THE EYE, the entire disclosures of which are expressly incorporated by reference herein.
U.S. Patent Documents

Number | Name | Date | Kind |
---|---|---|---|
725567 | Ives | Apr 1903 | A |
2039648 | Ives | May 1936 | A |
2427689 | Harold et al. | Sep 1947 | A |
3948585 | Heine et al. | Apr 1976 | A |
3971065 | Bayer | Jul 1976 | A |
3985419 | Matsumoto et al. | Oct 1976 | A |
4099051 | Gugliotta | Jul 1978 | A |
4180313 | Inuiya | Dec 1979 | A |
4193093 | St. Clair | Mar 1980 | A |
4230942 | Stauffer | Oct 1980 | A |
4370033 | Kani et al. | Jan 1983 | A |
4383170 | Takagi et al. | May 1983 | A |
4422736 | Nunokawa | Dec 1983 | A |
4477159 | Mizuno et al. | Oct 1984 | A |
4580219 | Pelc et al. | Apr 1986 | A |
4642678 | Cok | Feb 1987 | A |
4661986 | Adelson | Apr 1987 | A |
4694185 | Weiss | Sep 1987 | A |
4715704 | Biber et al. | Dec 1987 | A |
4732453 | De et al. | Mar 1988 | A |
4774574 | Daly et al. | Sep 1988 | A |
4812643 | Talbot | Mar 1989 | A |
4838678 | Hubertus | Jun 1989 | A |
4844617 | Kelderman et al. | Jul 1989 | A |
4849782 | Koyama et al. | Jul 1989 | A |
4920419 | Easterly | Apr 1990 | A |
5000563 | Gisel et al. | Mar 1991 | A |
5076687 | Adelson | Dec 1991 | A |
5099354 | Lichtman et al. | Mar 1992 | A |
5189511 | Parulski et al. | Feb 1993 | A |
5220360 | Verdooner et al. | Jun 1993 | A |
5248876 | Kerstens et al. | Sep 1993 | A |
5270747 | Kitajima et al. | Dec 1993 | A |
5282045 | Mimura et al. | Jan 1994 | A |
5321446 | Massig et al. | Jun 1994 | A |
5349398 | Koester | Sep 1994 | A |
5361127 | Daily | Nov 1994 | A |
5387952 | Byer | Feb 1995 | A |
5394205 | Ochiai et al. | Feb 1995 | A |
5400093 | Timmers | Mar 1995 | A |
5436679 | Ohtsuka et al. | Jul 1995 | A |
5446276 | Iyoda et al. | Aug 1995 | A |
5493335 | Parulski et al. | Feb 1996 | A |
5652621 | Adams et al. | Jul 1997 | A |
5659390 | Danko | Aug 1997 | A |
5659420 | Wakai et al. | Aug 1997 | A |
5668597 | Parulski et al. | Sep 1997 | A |
5717480 | Brooks et al. | Feb 1998 | A |
5729011 | Sekiguchi | Mar 1998 | A |
5748371 | Cathey et al. | May 1998 | A |
5757423 | Tanaka et al. | May 1998 | A |
5763871 | Ortyn et al. | Jun 1998 | A |
5793379 | Lapidous | Aug 1998 | A |
5883695 | Paul | Mar 1999 | A |
5912699 | Hayenga et al. | Jun 1999 | A |
5946077 | Nemirovskiy | Aug 1999 | A |
5949433 | Klotz | Sep 1999 | A |
5993001 | Bursell et al. | Nov 1999 | A |
6023523 | Cohen et al. | Feb 2000 | A |
6028606 | Kolb et al. | Feb 2000 | A |
6072623 | Kitajima | Jun 2000 | A |
6091075 | Shibata et al. | Jul 2000 | A |
6097394 | Levoy et al. | Aug 2000 | A |
6097541 | Davies et al. | Aug 2000 | A |
6137535 | Meyers | Oct 2000 | A |
6137937 | Okano et al. | Oct 2000 | A |
6192162 | Hamilton et al. | Feb 2001 | B1 |
6201619 | Neale et al. | Mar 2001 | B1 |
6201899 | Bergen | Mar 2001 | B1 |
6268846 | Georgiev | Jul 2001 | B1 |
6283596 | Yoshimura et al. | Sep 2001 | B1 |
6292218 | Parulski et al. | Sep 2001 | B1 |
6301416 | Okano et al. | Oct 2001 | B1 |
6320979 | Melen | Nov 2001 | B1 |
6339506 | Wakelin et al. | Jan 2002 | B1 |
6351269 | Georgiev | Feb 2002 | B1 |
6476805 | Shum et al. | Nov 2002 | B1 |
6483535 | Tamburrino et al. | Nov 2002 | B1 |
6538249 | Takane et al. | Mar 2003 | B1 |
6575575 | O'Brien et al. | Jun 2003 | B2 |
6577342 | Wester | Jun 2003 | B1 |
6580502 | Kuwabara | Jun 2003 | B1 |
6597859 | Lienhart et al. | Jul 2003 | B1 |
6711283 | Soenksen | Mar 2004 | B1 |
6715878 | Gobbi et al. | Apr 2004 | B1 |
6738533 | Shum et al. | May 2004 | B1 |
6838650 | Toh | Jan 2005 | B1 |
6842297 | Dowski, Jr. | Jan 2005 | B2 |
6875973 | Ortyn et al. | Apr 2005 | B2 |
6900841 | Mihara | May 2005 | B1 |
6927922 | George et al. | Aug 2005 | B2 |
6934056 | Gindele et al. | Aug 2005 | B2 |
7015418 | Cahill et al. | Mar 2006 | B2 |
7019671 | Kawai | Mar 2006 | B2 |
7034866 | Colmenarez et al. | Apr 2006 | B1 |
7054067 | Okano et al. | May 2006 | B2 |
7085062 | Hauschild | Aug 2006 | B2 |
7109459 | Kam et al. | Sep 2006 | B2 |
7118217 | Kardon et al. | Oct 2006 | B2 |
7156518 | Cornsweet et al. | Jan 2007 | B2 |
7336430 | George et al. | Feb 2008 | B2 |
7338167 | Zelvin et al. | Mar 2008 | B2 |
7377644 | Davis | May 2008 | B2 |
7425067 | Warden et al. | Sep 2008 | B2 |
7458683 | Chernyak | Dec 2008 | B2 |
7485834 | Gouch | Feb 2009 | B2 |
7542077 | Miki | Jun 2009 | B2 |
7549748 | Davis | Jun 2009 | B2 |
7620309 | Georgiev | Nov 2009 | B2 |
7623726 | Georgiev | Nov 2009 | B1 |
7706632 | Gouch | Apr 2010 | B2 |
7723662 | Levoy et al. | May 2010 | B2 |
7732744 | Utagawa | Jun 2010 | B2 |
7744219 | Davis | Jun 2010 | B2 |
7792423 | Raskar et al. | Sep 2010 | B2 |
7828436 | Goldstein | Nov 2010 | B2 |
7847837 | Dotsuna et al. | Dec 2010 | B2 |
7854510 | Verdooner et al. | Dec 2010 | B2 |
7880794 | Yamagata et al. | Feb 2011 | B2 |
7936392 | Ng et al. | May 2011 | B2 |
7949252 | Georgiev | May 2011 | B1 |
7956924 | Georgiev | Jun 2011 | B2 |
7965936 | Raskar et al. | Jun 2011 | B2 |
8019215 | Georgiev et al. | Sep 2011 | B2 |
8228417 | Georgiev et al. | Jul 2012 | B1 |
8238738 | Georgiev | Aug 2012 | B2 |
8289440 | Knight et al. | Oct 2012 | B2 |
8434869 | Davis | May 2013 | B2 |
8471897 | Rodriguez et al. | Jun 2013 | B2 |
8593564 | Border et al. | Nov 2013 | B2 |
8619177 | Perwass et al. | Dec 2013 | B2 |
10092183 | Berestka et al. | Oct 2018 | B2 |
10687703 | Berestka et al. | Jun 2020 | B2 |
20010012149 | Lin et al. | Aug 2001 | A1 |
20010050813 | Allio | Dec 2001 | A1 |
20020101567 | Sumiya | Aug 2002 | A1 |
20020140835 | Silverstein | Oct 2002 | A1 |
20020159030 | Frey et al. | Oct 2002 | A1 |
20030020883 | Hara | Jan 2003 | A1 |
20030067596 | Leonard | Apr 2003 | A1 |
20030103670 | Schoelkopf et al. | Jun 2003 | A1 |
20030117511 | Belz et al. | Jun 2003 | A1 |
20030156077 | Balogh | Aug 2003 | A1 |
20030160957 | Oldham et al. | Aug 2003 | A1 |
20030231255 | Szajewski et al. | Dec 2003 | A1 |
20040114176 | Bodin et al. | Jun 2004 | A1 |
20040114807 | Lelescu et al. | Jun 2004 | A1 |
20040223214 | Atkinson | Nov 2004 | A1 |
20040256538 | Olson et al. | Dec 2004 | A1 |
20040257360 | Sieckmann | Dec 2004 | A1 |
20050080602 | Snyder et al. | Apr 2005 | A1 |
20050088714 | Kremen | Apr 2005 | A1 |
20050122418 | Okita et al. | Jun 2005 | A1 |
20050286019 | Wiltberger et al. | Dec 2005 | A1 |
20050286800 | Gouch | Dec 2005 | A1 |
20060050229 | Farberov | Mar 2006 | A1 |
20060104542 | Blake et al. | May 2006 | A1 |
20060176566 | Boettiger et al. | Aug 2006 | A1 |
20060238847 | Gouch | Oct 2006 | A1 |
20070024931 | Compton et al. | Feb 2007 | A1 |
20070036462 | Crandall et al. | Feb 2007 | A1 |
20070046862 | Umebayashi et al. | Mar 2007 | A1 |
20070071316 | Kubo | Mar 2007 | A1 |
20070091197 | Okayama et al. | Apr 2007 | A1 |
20070147673 | Crandall | Jun 2007 | A1 |
20070230944 | Georgiev | Oct 2007 | A1 |
20070252074 | Ng et al. | Nov 2007 | A1 |
20070257772 | Marcelle et al. | Nov 2007 | A1 |
20080007626 | Wernersson | Jan 2008 | A1 |
20080018668 | Yamauchi | Jan 2008 | A1 |
20080044063 | Friedman et al. | Feb 2008 | A1 |
20080056549 | Hamill et al. | Mar 2008 | A1 |
20080107231 | Miyazaki et al. | May 2008 | A1 |
20080131019 | Ng | Jun 2008 | A1 |
20080152215 | Horie et al. | Jun 2008 | A1 |
20080165270 | Watanabe et al. | Jul 2008 | A1 |
20080166063 | Zeng | Jul 2008 | A1 |
20080180792 | Georgiev | Jul 2008 | A1 |
20080187305 | Raskar et al. | Aug 2008 | A1 |
20080193026 | Horie et al. | Aug 2008 | A1 |
20080218610 | Chapman et al. | Sep 2008 | A1 |
20080226274 | Spielberg | Sep 2008 | A1 |
20080247623 | Delso et al. | Oct 2008 | A1 |
20080266655 | Levoy et al. | Oct 2008 | A1 |
20080309813 | Watanabe | Dec 2008 | A1 |
20090027542 | Yamamoto et al. | Jan 2009 | A1 |
20090041381 | Georgiev et al. | Feb 2009 | A1 |
20090041448 | Georgiev et al. | Feb 2009 | A1 |
20090086304 | Yurlov et al. | Apr 2009 | A1 |
20090102956 | Georgiev | Apr 2009 | A1 |
20090128658 | Hayasaka et al. | May 2009 | A1 |
20090128669 | Ng et al. | May 2009 | A1 |
20090140131 | Utagawa | Jun 2009 | A1 |
20090185801 | Georgiev et al. | Jul 2009 | A1 |
20090200623 | Qian et al. | Aug 2009 | A1 |
20090225279 | Small | Sep 2009 | A1 |
20090268970 | Babacan et al. | Oct 2009 | A1 |
20090273843 | Raskar et al. | Nov 2009 | A1 |
20090295829 | Georgiev et al. | Dec 2009 | A1 |
20100026852 | Ng et al. | Feb 2010 | A1 |
20100085468 | Park et al. | Apr 2010 | A1 |
20100128145 | Pitts et al. | May 2010 | A1 |
20100129048 | Pitts et al. | May 2010 | A1 |
20100205388 | MacInnis | Aug 2010 | A1 |
20100265386 | Raskar et al. | Oct 2010 | A1 |
20110149239 | Neal et al. | Jun 2011 | A1 |
20110169994 | DiFrancesco et al. | Jul 2011 | A1 |
20110234841 | Akeley et al. | Sep 2011 | A1 |
20110234977 | Verdooner | Sep 2011 | A1 |
20110273609 | DiFrancesco et al. | Nov 2011 | A1 |
20110313294 | De et al. | Dec 2011 | A1 |
20120101371 | Verdooner | Apr 2012 | A1 |
20120249550 | Akeley et al. | Oct 2012 | A1 |
20120294590 | Pitts et al. | Nov 2012 | A1 |
20120327222 | Ng et al. | Dec 2012 | A1 |
20130010260 | Tumlinson et al. | Jan 2013 | A1 |
20130033636 | Pitts et al. | Feb 2013 | A1 |
20130113981 | Knight et al. | May 2013 | A1 |
20130169934 | Verdooner | Jul 2013 | A1 |
20130208241 | Lawson et al. | Aug 2013 | A1 |
20130222606 | Pitts et al. | Aug 2013 | A1 |
20130222633 | Knight et al. | Aug 2013 | A1 |
20130222652 | Akeley et al. | Aug 2013 | A1 |
20130235267 | Pitts et al. | Sep 2013 | A1 |
20130237970 | Summers et al. | Sep 2013 | A1 |
20130301003 | Wells et al. | Nov 2013 | A1 |
20140013273 | Ng | Jan 2014 | A1 |
20140016019 | Pitts et al. | Jan 2014 | A1 |
20140078259 | Hiramoto et al. | Mar 2014 | A1 |
20140129988 | Liang et al. | May 2014 | A1 |
20140139807 | Uchiyama | May 2014 | A1 |
20140218685 | Nakamura | Aug 2014 | A1 |
20140226128 | Lawson et al. | Aug 2014 | A1 |
20140300817 | Bezman et al. | Oct 2014 | A1 |
20150305614 | Narasimha-Iyer et al. | Oct 2015 | A1 |
20150347841 | Mears | Dec 2015 | A1 |
20160213249 | Cornsweet | Jul 2016 | A1 |
20170237974 | Samec et al. | Aug 2017 | A1 |
20190053703 | Berestka et al. | Feb 2019 | A1 |
20200315451 | Berestka et al. | Oct 2020 | A1 |
Foreign Patent Documents

Number | Date | Country |
---|---|---|
102871643 | Jan 2013 | CN |
2695571 | Feb 2014 | EP |
2695572 | Feb 2014 | EP |
2013-527775 | Jul 2013 | JP |
2014-033812 | Feb 2014 | JP |
2013162471 | Oct 2013 | WO |
Other Publications

Entry |
---|
Communication under Rule 164(2)(a) EPC and Partial Supplementary Search Report, European Application No. 15763714.1, issued by the European Patent Office, dated Oct. 15, 2019, 5 pages. |
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2015/047747, dated Mar. 9, 2017, 8 pages. |
International Search Report and Written Opinion of the International Searching Authority, PCT/US2015/047747, dated Jan. 20, 2016, 13 pages. |
Manakov, Alkhazur et al., A Reconfigurable Camera Add-On for High Dynamic Range, Multispectral, Polarization, and Light-Field Imaging, ACM Transactions on Graphics, Association for Computing Machinery, 2013, Proceeding of SIGGRAPH, 32 (4), pp. 47:1-47-14. |
Office Action, Japanese Patent Office, Japanese Patent Application No. 2017-531455, dated Mar. 24, 2020, 7 pages. |
Raskar, Ramesh et al., Glare Aware Photography: 4D Ray Sampling for Reducing Glare Effects of Camera Lenses, Mitsubishi Electric Research Laboratories, SIGGRAPH 2008, http://www.merl.com, 12 pages. |
Veeraraghavan, Ashok et al., Dappled Photography: Mask Enhanced Cameras for Heterodyned Light Fields and Coded Aperture Refocusing, Mitsubishi Electric Research Laboratories, Proc. ACM SIGGRAPH, Jul. 2007, http://www.merl.com, 12 pages. |
Wilburn, Bennett et al., High Performance Imaging Using Large Camera Arrays, ACM Transactions on Graphics (proceedings SIGGRAPH) vol. 24, No. 3., 2005, pp. 765-776. |
Number | Date | Country |
---|---|---|
20220400950 A1 | Dec 2022 | US |
Number | Date | Country |
---|---|---|
62044253 | Aug 2014 | US |
Relation | Number | Date | Country |
---|---|---|---|
Parent | 15438480 | Feb 2017 | US |
Child | 16109593 | US |
Relation | Number | Date | Country |
---|---|---|---|
Parent | 16905408 | Jun 2020 | US |
Child | 17895382 | US | |
Parent | 16109593 | Aug 2018 | US |
Child | 16905408 | US |
Relation | Number | Date | Country |
---|---|---|---|
Parent | PCT/US2015/047747 | Aug 2015 | US |
Child | 15438480 | US |