This disclosure relates generally to retinal imaging technologies, and in particular but not exclusively, relates to illumination techniques for retinal imaging.
Retinal imaging is a part of basic eye exams for screening, field diagnosis, and progress monitoring of many retinal diseases. A high-fidelity retinal image is important for accurate screening, diagnosis, and monitoring. Bright illumination of the posterior interior surface of the eye (i.e., retina) through the pupil improves image fidelity but often creates optical aberrations or image artifacts, such as corneal reflections, iris reflections, lens flare, haze, or pupillary shadows, if the retinal camera and illumination source are not appropriately aligned with the eye. Simply increasing the brightness of the illumination does not overcome these problems, but rather makes the optical artifacts more pronounced, which undermines the goal of improving image fidelity.
Accordingly, camera alignment is very important, particularly with conventional retinal cameras, which typically have a limited eyebox due to the need to block the deleterious image artifacts listed above. Referring to
A conventional retinal camera system (such as retinal camera 105) uses a single eyebox 100 having a single location (defined relative to the eyepiece lens 110 of the camera system) from which both the left-side and right-side eyes are imaged. However, this single location is a compromise location that is not optimized for the individual eye and furthermore does not account for the need to obtain higher quality images in specific regions of interest within the left and/or right eyes to help the doctor screen, diagnose, monitor, or treat specific ophthalmic pathologies.
Non-limiting and non-exhaustive embodiments of the invention are described with reference to the following figures, wherein like reference numerals refer to like parts throughout the various views unless otherwise specified. Not all instances of an element are necessarily labeled so as not to clutter the drawings where appropriate. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating the principles being described.
Embodiments of an apparatus, system, and method of operation for a retinal imaging system that adapts the eyebox location based upon pathologies of interest (POI) and/or eye sidedness are described herein. In the following description numerous specific details are set forth to provide a thorough understanding of the embodiments. One skilled in the relevant art will recognize, however, that the techniques described herein can be practiced without one or more of the specific details, or with other methods, components, materials, etc. In other instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring certain aspects.
Reference throughout this specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of the phrases “in one embodiment” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
High fidelity retinal images are important for screening, diagnosing, and monitoring many retinal diseases. To this end, it is desirable to reduce or eliminate instances of image artifacts that occlude or otherwise degrade portions of the retinal image. This can be particularly true when specific regions of interest in a particular eye (e.g., right-sided eye or left-sided eye) need to be clearly imaged to screen for, diagnose, monitor, or treat a specific ophthalmic pathology. Conventional retinal imaging systems use an eyebox in a fixed eyebox location that is not only fixed for a given eye, but also fixed across both the left-side and right-side eyes.
The desirability of dynamically selected eyebox location and/or illumination patterns is further highlighted by
The optical relay system serves to direct (e.g., pass or reflect) illumination light 480 output from illuminator 405 along an illumination path through the pupil of eye 470 to illuminate retina 475 while also directing image light 485 of retina 475 (i.e., the retinal image) along an imaging path to image sensor 410. Image light 485 is formed by the scattered reflection of illumination light 480 off of retina 475. In the illustrated embodiment, the optical relay system further includes beam splitter 450, which passes at least a portion of image light 485 to image sensor 410 while also optically coupling dynamic fixation target 425 to eyepiece lens assembly 435 and directing dynamic fixation image 427 output from display 426 to eye 470. Beam splitter 450 may be implemented as a polarized beam splitter, a non-polarized beam splitter (e.g., 90% transmissive and 10% reflective, 50/50 beam splitter, etc.), a multi-layer dichroic beam splitter, or otherwise. The optical relay system includes a number of lenses, such as lenses 435, 440, and 445, to focus the various light paths as needed. For example, lens 435 may include one or more lensing elements that collectively form an eyepiece lens assembly that is displaced from the cornea of eye 470 by an eye relief 495 during operation. Lens 440 may include one or more lens elements for bringing image light 485 to a focus on image sensor 410. Lens 445 may include one or more lens elements for focusing dynamic fixation image 427. It should be appreciated that the optical relay system may be implemented with a number and variety of optical elements (e.g., lenses, reflective surfaces, diffractive surfaces, etc.) and may vary from the configuration illustrated in
In one embodiment, dynamic fixation image 427 output from display 426 represents a point of fixation upon which the patient can accommodate their focus and fix their gaze. The dynamic fixation image 427 may be an image of a plus-sign, a bullseye, a cross, a target, circles, or other shape or collection of shapes (e.g., see
Controller 415 is coupled to image sensor 410, display 426, illuminator 405, and alignment tracking camera system 430 to orchestrate their operation. Controller 415 may include software/firmware logic executing on a microcontroller, hardware logic (e.g., application specific integrated circuit, field programmable gate array, etc.), or a combination of software and hardware logic. Although
Image sensor 410 may be implemented using a variety of imaging technologies, such as complementary metal-oxide-semiconductor (CMOS) image sensors, charge-coupled device (CCD) image sensors, or otherwise. In one embodiment, image sensor 410 includes an onboard memory buffer or attached memory to store/buffer retinal images. In one embodiment, image sensor 410 may include an integrated image signal processor (ISP) to permit high-speed digital processing of retinal images buffered in the onboard memory. The onboard image buffer and ISP may facilitate high frame rate image burst captures, image processing, image stacking, and output of high-quality composite retinal images. The integrated ISP may be considered a decentralized component of controller 415.
Alignment tracking camera system 430 operates to track lateral alignment (or misalignment) and relief offset between retinal imaging system 400 and eye 470, and in particular, between eyepiece lens assembly 435 and eye 470. System 430 may operate using a variety of different techniques to track the relative position of eye 470 to retinal imaging system 400 including pupil tracking or iris tracking. In the illustrated embodiment, system 430 includes two cameras disposed on either side of eyepiece lens assembly 435 to enable triangulation and obtain X, Y, and Z gross position information about the pupil or iris. In one embodiment, system 430 also includes one or more infrared (IR) emitters to track eye 470 with IR light while retinal images are acquired with bursts of visible spectrum light output through eyepiece lens assembly 435 from illuminator 405. In such an embodiment, IR filters may be positioned within the image path to filter the IR tracking light. In some embodiments, the tracking illumination is temporally offset from image acquisition with white light bursts.
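The two-camera triangulation described above can be sketched with simple stereo geometry. The following is a minimal illustration, assuming rectified cameras symmetric about the eyepiece axis; the function name, parameter names, and numeric values are assumptions for illustration, not taken from the disclosure.

```python
# Illustrative sketch: triangulating gross pupil position from two
# tracking cameras a known baseline apart (all names and values are
# assumptions, not from the disclosure).

def triangulate_pupil(x_left, x_right, y, baseline_mm, focal_px):
    """Estimate (X, Y, Z) of the pupil in millimeters.

    x_left/x_right: horizontal pupil position (pixels) in each camera,
    measured from each image center; y: vertical position (pixels).
    """
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("pupil must lie in front of both cameras")
    z = baseline_mm * focal_px / disparity          # relief-axis depth
    x = x_left * z / focal_px - baseline_mm / 2.0   # lateral offset
    y_mm = y * z / focal_px                         # vertical offset
    return (x, y_mm, z)

# Example: 40 mm baseline, 400 px focal length, centered pupil
pos = triangulate_pupil(320.0, -320.0, 80.0, 40.0, 400.0)
# pos -> (0.0, 5.0, 25.0): on-axis laterally, 5 mm high, 25 mm relief
```

In practice the gross X, Y, Z estimate would feed the alignment feedback loop; sub-pixel pupil detection and camera calibration are omitted here.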
Lateral eye alignment may be measured via retinal images acquired by image sensor 410, or separately/additionally, by system 430. In the illustrated embodiment, system 430 is positioned externally to view eye 470 from outside of eyepiece lens assembly 435. In other embodiments, system 430 may be optically coupled via the optical relay components to view and track eye 470 through eyepiece lens assembly 435.
Returning to
In a process block 605, the retinal imaging process is initiated. Initiation may include the user pressing a power button on user interface 420. After powering on, controller 415 obtains an indication of the POI related to the eye being examined. This indication may be solicited via user interface 420, or otherwise input by the user/operator of retinal imaging system 400. Example POIs may include diabetic retinopathy, glaucoma, or otherwise. Determination of the particular POI enables controller 415 to configure the eyebox location and/or illumination patterns to best inspect the portion(s) of retina 475 that is/are most pertinent to the particular ophthalmic disease selected.
In a process block 615, illumination is enabled to obtain preliminary eye images to facilitate eye tracking and/or determine eye-sidedness. In one embodiment, this initial illumination is IR illumination output from alignment tracking camera system 430 and/or IR emitters of light sources 505. The IR illumination reduces the likelihood that the light will result in a physiological response that constricts the iris prior to acquiring the primary retinal images.
In a process block 620, the eye-sidedness (i.e., right-sided eye or a left-sided eye) is determined. Eye-sidedness may be manually input via user interface 420 or automatically determined by controller 415 based upon image analysis and feature identification performed on a preliminary image of the eye. The preliminary image may be an IR retinal image acquired via image sensor 410 and/or eye images acquired by alignment tracking camera system 430.
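One plausible way to automate the feature-identification step above is to use the lateral position of the optic disc, which sits nasally: toward the right half of a conventionally oriented right-eye fundus image and the left half of a left-eye image. This heuristic is an assumption for illustration, not the disclosure's stated method, and it presumes the disc has already been localized.

```python
# Illustrative heuristic (not the disclosure's method): infer
# eye-sidedness from the optic disc's horizontal position. In a
# conventionally oriented fundus image the disc lies nasally: right
# half of the frame for a right eye, left half for a left eye.

def eye_sidedness(disc_x, image_width):
    """Return 'right' or 'left' from the optic disc x-coordinate (pixels)."""
    return "right" if disc_x > image_width / 2 else "left"

side = eye_sidedness(disc_x=1400, image_width=2048)  # disc in right half
```

Locating the disc itself (e.g., as the brightest compact region) is a separate detection problem outside this sketch.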
With one or both of eye-sidedness and POI determined, the eyebox location for retinal imaging system 400 may be selected (process block 625). The determination may be based upon either one or both of these factors. The eyebox location is the location of the eyebox of the imaging system, which is a bounded region in space defined relative to the eyepiece lens assembly. As illustrated in
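The selection in process block 625 can be sketched as a table lookup keyed on sidedness and POI. The table values below (offsets in millimeters relative to the eyepiece optical axis) and the fallback behavior are illustrative placeholders, not values from the disclosure.

```python
# Sketch of per-eye, per-POI eyebox selection (all offsets are
# illustrative placeholders, not from the disclosure).

EYEBOX_TABLE = {
    # (sidedness, pathology_of_interest): (dx, dy, dz) in mm
    ("right", "diabetic retinopathy"): (1.5, 0.0, 0.5),
    ("left",  "diabetic retinopathy"): (-1.5, 0.0, 0.5),
    ("right", "glaucoma"):             (2.0, -0.5, 0.0),
    ("left",  "glaucoma"):             (-2.0, -0.5, 0.0),
}
# Fallback: the single compromise location of a conventional system.
DEFAULT_EYEBOX = (0.0, 0.0, 0.0)

def select_eyebox(sidedness, poi):
    return EYEBOX_TABLE.get((sidedness, poi), DEFAULT_EYEBOX)

offset = select_eyebox("left", "glaucoma")  # -> (-2.0, -0.5, 0.0)
```

Keying on both factors captures the point that a single fixed eyebox cannot be optimal for each eye and each pathology simultaneously.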
With the eyebox location selected, the fixation location of dynamic fixation target 425 may be configured to encourage eye 470 to adjust its position and/or gaze direction accordingly (process block 630).
Once threshold alignment is achieved (decision block 640), illuminator 405 is configured by controller 415 to select the appropriate illumination pattern for retinal imaging. The illumination pattern may be selected based upon pupil location and pupil size to reduce image artifacts and optimize retinal image quality (process block 650). In one embodiment, a lookup table (LUT) may index illumination patterns to pupil position and/or pupil size. In yet other embodiments, the LUT may further index illumination patterns to POI and/or eye-sidedness for further pattern refinement. For example, pattern selection may consider not only the current location of the eye relative to eyepiece lens assembly 435, but also the anatomical feature that is relevant to a given pathology, choosing an illumination pattern that shifts various image artifacts away from that anatomical feature in the retinal images. This may be considered a finer illumination pattern refinement in addition to the selection of the illumination pattern based upon real-time eye position tracking.
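A LUT indexed by quantized pupil position and size, as described above, might look like the following. The bin sizes, segment names (assuming a segmented ring illuminator), and pattern entries are all assumptions for illustration, not from the disclosure.

```python
# Sketch of an illumination-pattern LUT keyed on quantized pupil
# position and size (bin widths, segment names, and entries are
# illustrative assumptions).

def quantize(value, step):
    return round(value / step)  # note: Python round() is banker's rounding

ILLUM_LUT = {
    # (x_bin, y_bin, size_bin): active segments of an assumed ring illuminator
    (0, 0, 1):  ("nasal", "temporal"),
    (1, 0, 1):  ("temporal",),   # pupil shifted: illuminate the far side
    (-1, 0, 1): ("nasal",),
    (0, 0, 0):  ("nasal", "temporal", "superior", "inferior"),  # small pupil
}

def select_pattern(pupil_x_mm, pupil_y_mm, pupil_diam_mm):
    key = (quantize(pupil_x_mm, 1.0), quantize(pupil_y_mm, 1.0),
           quantize(pupil_diam_mm, 4.0))
    return ILLUM_LUT.get(key, ("nasal", "temporal"))  # safe default

pattern = select_pattern(1.2, 0.1, 4.5)  # shifted pupil -> ("temporal",)
```

A finer refinement, as the text notes, could add POI and eye-sidedness to the key so artifacts are steered away from the pathology-relevant anatomy.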
With threshold alignment achieved (decision block 640) and the appropriate illumination pattern selected (process block 650), illuminator 405 illuminates retina 475 through the pupil. This illumination may be a white light flash, though the particular wavelengths used for illumination (e.g., broadband white light, IR light, near-IR, etc.) may be tailored for a particular pathology or application. The illumination flash in process block 655 may only last for a period of time (e.g., 200 msec) that is less than or equal to the human physiological response time (e.g., pupil constriction or eye blink). While illumination is active, one or more retinal images are acquired (process block 660). In one embodiment, acquisition of a burst of retinal images (e.g., 5, 10, 20, 50, 100 images) is triggered during the illumination window and while the eye remains in the selected eyebox as determined from real-time feedback from alignment tracking camera system 430 (or image sensor 410).
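The time-bounded burst acquisition described above can be sketched as a loop gated on elapsed flash time and continued eyebox residency. The capture and tracking functions here are stand-in hooks, not the disclosure's API, and the 200 ms window is the example value from the text.

```python
# Sketch of a flash-window burst: acquire frames only while the flash
# duration stays under the physiological response time and the eye
# remains inside the selected eyebox (grab_frame/eye_in_eyebox are
# stand-in hardware hooks, not the disclosure's API).
import time

RESPONSE_WINDOW_S = 0.2  # ~200 ms, before pupil constriction or blink

def capture_burst(grab_frame, eye_in_eyebox, max_frames=50):
    frames = []
    start = time.monotonic()
    while (time.monotonic() - start < RESPONSE_WINDOW_S
           and len(frames) < max_frames
           and eye_in_eyebox()):
        frames.append(grab_frame())
    return frames

# Usage with stub hooks standing in for the sensor and tracker:
burst = capture_burst(grab_frame=lambda: "frame",
                      eye_in_eyebox=lambda: True,
                      max_frames=5)
```

Gating on the eyebox check each iteration reflects the real-time feedback from alignment tracking camera system 430 (or image sensor 410) during the flash.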
The burst of retinal images may be buffered onboard a camera chip including image sensor 410 where an image signal processor (ISP) can quickly analyze the quality of the acquired retinal images. The ISP may be considered a component of controller 415 (e.g., decentralized offload compute engine) that is located close to image sensor 410 to enable high-speed image processing. If the images are occluded, obscured, or otherwise inadequate, then process 600 returns to process block 630 to repeat the relevant portions of process 600. However, if the acquired images are collectively deemed sufficient to adequately capture the region of interest relevant to the POI, then the retinal images may be combined (process block 670) to generate, output, and save a high-quality composite image (process block 675). A variety of different combining techniques may be implemented such as image stacking or otherwise.
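The cull-and-combine step can be sketched as follows: score each burst frame with a simple sharpness proxy, drop inadequate frames, and average the rest. The metric, threshold, and grayscale-rows representation are illustrative assumptions; a real ISP would use calibrated quality metrics and registration before stacking.

```python
# Sketch of quality screening plus image stacking (metric, threshold,
# and data layout are illustrative; frames are grayscale pixel rows).

def sharpness(frame):
    # Mean absolute horizontal gradient: blurred/occluded frames score low.
    diffs = [abs(row[i + 1] - row[i])
             for row in frame for i in range(len(row) - 1)]
    return sum(diffs) / len(diffs)

def stack(frames, min_score=1.0):
    keep = [f for f in frames if sharpness(f) >= min_score]
    if not keep:
        return None  # all frames inadequate: realign and re-acquire
    h, w = len(keep[0]), len(keep[0][0])
    # Pixel-wise mean over the retained frames.
    return [[sum(f[r][c] for f in keep) / len(keep) for c in range(w)]
            for r in range(h)]

sharp = [[0, 10, 0, 10]]   # strong gradients, kept
blurry = [[5, 5, 5, 5]]    # flat, scores 0, dropped
composite = stack([sharp, blurry, sharp])
# composite -> [[0.0, 10.0, 0.0, 10.0]]
```

Averaging the retained frames suppresses sensor noise, which is the essential benefit of stacking a burst rather than relying on a single exposure.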
The processes explained above are described in terms of computer software and hardware. The techniques described may constitute machine-executable instructions embodied within a tangible or non-transitory machine (e.g., computer) readable storage medium, that when executed by a machine will cause the machine to perform the operations described. Additionally, the processes may be embodied within hardware, such as an application specific integrated circuit (“ASIC”) or otherwise.
A tangible machine-readable storage medium includes any mechanism that provides (i.e., stores) information in a non-transitory form accessible by a machine (e.g., a computer, network device, personal digital assistant, manufacturing tool, any device with a set of one or more processors, etc.). For example, a machine-readable storage medium includes recordable/non-recordable media (e.g., read only memory (ROM), random access memory (RAM), magnetic disk storage media, optical storage media, flash memory devices, etc.).
The above description of illustrated embodiments of the invention, including what is described in the Abstract, is not intended to be exhaustive or to limit the invention to the precise forms disclosed. While specific embodiments of, and examples for, the invention are described herein for illustrative purposes, various modifications are possible within the scope of the invention, as those skilled in the relevant art will recognize.
These modifications can be made to the invention in light of the above detailed description. The terms used in the following claims should not be construed to limit the invention to the specific embodiments disclosed in the specification. Rather, the scope of the invention is to be determined entirely by the following claims, which are to be construed in accordance with established doctrines of claim interpretation.
This application claims the benefit of U.S. Application No. 63/345,258, filed on May 24, 2022, the contents of which are incorporated herein by reference.
| Filing Document | Filing Date | Country | Kind |
| --- | --- | --- | --- |
| PCT/US2023/013737 | 2/23/2023 | WO | |
| Number | Date | Country |
| --- | --- | --- |
| 63345258 | May 2022 | US |