The present disclosure relates to techniques for imaging and/or measuring a subject's eye, including the subject's retina fundus.
Techniques for imaging and/or measuring a subject's eye would benefit from improvement.
Some aspects of the present disclosure relate to an imaging and/or measuring device comprising an adjustable flexure having one or more lenses therein.
Some aspects of the present disclosure relate to an imaging and/or measuring device comprising an adjustable flexure configured to provide variable diopter compensation.
Some aspects of the present disclosure relate to a method comprising imaging and/or measuring a person's eye using an adjustable flexure within an imaging and/or measuring device, the adjustable flexure having one or more lenses therein.
Some aspects of the present disclosure relate to a method comprising providing variable diopter compensation for imaging and/or measuring a person's eye using an adjustable flexure.
The foregoing summary is not intended to be limiting. Moreover, various aspects of the present disclosure may be implemented alone or in combination with other aspects.
The accompanying drawings are not intended to be drawn to scale. In the drawings, each identical or nearly identical component that is illustrated in various figures is represented by a like numeral. For purposes of clarity, not every component may be labeled in every drawing. In the drawings:
The inventors have recognized and appreciated that a person's eyes provide a window into the body that may be used not only to determine whether the person has an ocular disease, but also to determine the general health of the person. The retina fundus in particular can provide valuable information via imaging for use in various health determinations. However, conventional systems for imaging and/or measuring the fundus provide only superficial information about the subject's eye and cannot provide sufficient information to diagnose certain diseases. Accordingly, in some embodiments, multiple modes of imaging and/or measuring are used to more fully image the fundus of a subject. For example, two or more techniques may be used to simultaneously image and/or measure the fundus. In some embodiments, some or each of optical imaging, fluorescence imaging, and optical coherence tomography may be used to provide multimodal imaging and/or measuring of the fundus. The inventors have recognized that by using multimodal imaging, as compared to conventional, unimodal imaging, a greater amount of information may be obtained about the fundus that may be used to determine the health of the subject. In some embodiments, two or more of optical imaging, optical coherence tomography (OCT), fluorescence spectral imaging, and fluorescence lifetime imaging (FLI) may be used to provide multimodal images of the fundus. By way of example, a device that jointly uses color optical imaging, infrared (IR) imaging, OCT, autofluorescence spectral imaging, and FLI provides five modes of imaging the fundus.
The inventors have further recognized and appreciated that making the device portable, handheld, and affordable would have the greatest impact on global health. Countries or regions that cannot afford specialized facilities for diagnosing certain diseases and/or do not have the medical specialists to analyze data from imaging tests are often left behind to the detriment of the overall health of the population. A portable device that may be brought to any low-income community allows greater access to important healthcare diagnostics. Accordingly, some embodiments are directed to an apparatus that includes multiple modes of imaging the fundus within a housing that is portable and, in some examples, handheld. In some embodiments, the apparatus has a binocular form factor such that a subject may hold the apparatus up to the eyes for fundus imaging. In some embodiments, one or more of the modes of imaging may share optical components to make the apparatus more compact, efficient, and cost effective. For example, a color optical imaging device and a fluorescence imaging device may be housed in a first half of the binocular housing of the apparatus and the OCT device may be housed in the second half of the binocular housing.
Using such an apparatus, both eyes of the subject may be imaged simultaneously using the different devices. For example, the subject's left eye may be imaged using the optical imaging device and/or the fluorescence imaging device while the subject's right eye is imaged using the OCT device. After the initial imaging is complete, the subject can reverse the orientation of the binocular apparatus such that each eye is then measured with the devices disposed in the other half of the binocular housing, e.g., the left eye is imaged using the OCT device and the right eye is imaged using the optical imaging device and/or the fluorescence imaging device. To ensure the apparatus can operate in both orientations, the front surface of the apparatus that is placed near the subject's eyes may be substantially symmetric. Additionally or alternatively, the two halves of the apparatus's housing may be connected by a hinge that allows the two halves to be adjusted to either orientation.
As shown in
As shown in
Control panel 125 may be electrically coupled to electronics 120. For example, the scan buttons of control panel 125 may be configured to communicate an image capture and/or scan command to electronics 120 to initiate a scan using imaging device 122 and/or 123. As another example, the power button of control panel 125 may be configured to communicate a power on or power off command to electronics 120. As illustrated in
As shown in
In some embodiments, imaging apparatus described herein may be configured for mounting to a stand, as illustrated in the example of
As illustrated in
In some embodiments, holding portion 158 (or some other portion of stand 150) may include charging hardware configured to transmit power to imaging apparatus 100 through a wired or wireless connection. In one example, the charging hardware in stand 150 may include a power supply coupled to one or a plurality of wireless charging coils, and imaging apparatus 100 may include wireless charging coils configured to receive power from the coils in stand 150. In another example, charging hardware in stand 150 may be coupled to an electrical connector on an exterior facing side of holding portion 158 such that a complementary connector of imaging apparatus 100 interfaces with the connector of stand 150 when imaging apparatus 100 is seated in holding portion 158. In accordance with various embodiments, the wireless charging hardware may include one or more power converters (e.g., AC to DC, DC to DC, etc.) configured to provide an appropriate voltage and current to imaging apparatus 100 for charging. In some embodiments, stand 150 may house at least one rechargeable battery configured to provide the wired or wireless power to imaging apparatus 100. In some embodiments, stand 150 may include one or more power connectors configured to receive power from a standard wall outlet, such as a single-phase wall outlet.
In some embodiments, front housing portion 105 may include multiple portions 105a and 105b. Portion 105a may be formed using a mechanically resilient material whereas front portion 105b may be formed using a mechanically compliant material, such that front housing portion 105 is comfortable for a user to wear. For example, in some embodiments, portion 105a may be formed using plastic and portion 105b may be formed using rubber or silicone. In other embodiments, front housing portion 105 may be formed using a single mechanically resilient or mechanically compliant material. In some embodiments, portion 105b may be disposed on an exterior side of front housing portion 105, and portion 105a may be disposed within portion 105b.
In some embodiments, white light components of white light and/or fluorescence components 202 may be configured to illuminate a subject's eye with white light (or a lesser portion of the spectrum of visible light) and receive reflected light from the subject's eye to capture an image of the subject's eye. In some embodiments, fluorescence components of white light and/or fluorescence components 202 may be configured to transmit, to a subject's eye, excitation light configured to excite luminescent molecules in the subject's eye (e.g., naturally luminescent molecules and/or a luminescent dye) and receive fluorescent light from the subject's eye to capture an image of the subject's eye. For example, the fluorescence components may include fluorescence lifetime imaging components, fluorescence intensity imaging components, fluorescence spectral imaging components, and/or a combination thereof.
In some embodiments, OCT components of OCT and/or IR components 204 may be configured to illuminate a subject's eye with light from a light source (e.g., a super-luminescent diode) and compare light reflected from the subject's eye with light reflected from a reference surface to capture an image (e.g., one or more depth scans) of the subject's eye. In some embodiments, IR components of OCT and/or IR components 204 may be configured to illuminate a subject's eye with IR light from an IR light source and receive IR light from the subject's eye to capture an image of the subject's eye.
It should be appreciated that, in some embodiments, white light and/or fluorescence components 202 may include only white light components or only fluorescence components. Similarly, in some embodiments, OCT and/or IR components may include only OCT components or only IR components. Moreover, according to various embodiments, some or each of white light, fluorescence, OCT, and/or IR components may be positioned on either side of an imaging and/or measuring apparatus, alone or in various combinations with one another.
Described herein are exemplary configurations of white light and fluorescence imaging and/or measuring components. Although the exemplary configurations illustrated herein include each of white light and fluorescence imaging and/or measuring components, it should be appreciated that white light and/or fluorescence imaging and/or measuring components described herein may be included alone or in combination with one another and/or with other modes of imaging and/or measuring devices.
In some embodiments, source components 310 may be configured to generate and provide light to sample components 320 for focusing on the subject's eye such that light reflected and/or fluorescence light emitted from the subject's eye may be captured using fluorescence detection components 340 and/or white light detection components 350. In
In some embodiments, LEDs 312 may include one or more fluorescence excitation LEDs, which may be configured to excite luminescent molecules of interest in the subject's eye. In some embodiments, LEDs 312 may be configured to generate excitation light having a wavelength between 460 nm and 500 nm, such as between 480 nm and 500 nm and/or between 465 nm and 485 nm. In some embodiments, LEDs 312 may be configured to generate light having a bandwidth of 5-6 nm. In some embodiments, LEDs 312 may be configured to generate light having a bandwidth of 20-30 nm. It should be appreciated that some embodiments may include a plurality of lasers configured to generate light having different wavelengths.
As shown in
In
In some embodiments, sample components 320 may be configured to focus light from source components 310 and fixation light from fixation components 330 on the subject's eye and provide received light (e.g., reflected and/or emitted) from the subject's eye to fluorescence detection components 340 and/or white light detection components 350. In
In some embodiments, fluorescence dichroic 324 may be configured to transmit white light and/or excitation light and reflect fluorescence light such that white light from source components 310 may reach the subject's eye and reflected white light from the subject's eye may reach white light detection components 350, whereas fluorescence dichroic 324 may be configured to reflect fluorescent emissions from the subject's eye toward fluorescence detection components 340. For example, fluorescence dichroic 324 may be configured as a long pass filter. In some embodiments, fluorescence dichroic 324 may be configured to transmit at least some of the received fluorescent emissions to white light detection components 350 and/or reflect at least some of the reflected white light to fluorescence detection components 340. For example, fluorescence dichroic 324 may be configured as a beam splitter for at least some wavelengths of white light and/or fluorescence emissions. According to various embodiments, fluorescence dichroic 324 may have a transmission/reflection transition between 550 nm and 625 nm, such as at 550 nm, 575 nm, 600 nm, or 625 nm.
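The routing behavior of fluorescence dichroic 324 can be sketched as an idealized edge filter. The 575 nm transition used below is just one value from the 550-625 nm range stated above, and the model is a simplification: a real dichroic has a gradual edge and, as noted, may partially split some wavelengths rather than cleanly transmitting or reflecting them.

```python
def dichroic_route(wavelength_nm, transition_nm=575.0):
    """Idealized model of a long-pass-style dichroic: wavelengths below
    the transition are transmitted (white light / excitation path), and
    wavelengths at or above it are reflected toward the fluorescence
    detection components. The 575 nm transition is an assumed value
    within the 550-625 nm range given in the text."""
    if wavelength_nm < transition_nm:
        return "transmitted"
    return "reflected"

print(dichroic_route(480))  # excitation light → transmitted
print(dichroic_route(620))  # fluorescence emission → reflected
```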
In some embodiments, fixation beam splitter 326 may be configured to transmit white light, excitation light, and/or fluorescent light and reflect fixation light from fixation components 330 towards the subject's eye, such that white light and excitation light from source components 310 may reach the subject's eye and white light and/or fluorescent light received from the subject's eye may reach white light detection components 350 and/or fluorescence detection components 340, respectively. In some embodiments, fixation beam splitter 326 may be configured as a long pass filter and/or as a beam splitter at least for wavelengths of fixation light. In some embodiments, fixation beam splitter 326 may be configured to transmit light toward a photodetector (PD) and through a PD lens, where the PD is configured to determine whether the amount of light to be transmitted toward the subject's eye exceeds a safety threshold.
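The photodetector safety check described above can be sketched as a simple gate on the measured light level. Both the threshold value and the units below are illustrative placeholders rather than values from the disclosure; an actual limit would come from the applicable ocular light-hazard standard.

```python
def source_enable_allowed(pd_reading_uw, limit_uw=750.0):
    """Return True only if the photodetector reading (in microwatts)
    is at or below the safety limit, so the light source may be
    enabled. The 750 uW limit and the units are assumed placeholders,
    not values stated in the disclosure."""
    return pd_reading_uw <= limit_uw

print(source_enable_allowed(200.0))  # → True (safe to illuminate)
print(source_enable_allowed(900.0))  # → False (exceeds threshold)
```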
In some embodiments, objective lenses 328 may be configured to focus light from source components on the subject's eye and focus light from the subject's eye toward the appropriate detection components. In some embodiments, objective lenses 328 may include a plurality of plano-concave (PCV), plano-convex (PCX), and biconcave lenses. For example, objective lenses 328 may include two opposite-facing PCX lenses with a PCV lens and a biconcave lens between the PCX lenses. In some embodiments, objective lenses 328 may include an achromatic doublet. For example, the achromatic doublet can include a biconvex (BCX) lens and a meniscus negative lens. In some embodiments, one or more of the lenses of objective lenses 328 can include an aspheric surface, which provides improved image sharpness. For example, the aspheric surface can be a rear surface of the achromatic doublet.
In some embodiments, fixation components 330 may be configured to transmit fixation light toward the subject's eye to display a visible fixation object. In
In some embodiments, fluorescence detection components 340 may be configured to receive fluorescent light from the subject's eye reflected via fluorescence dichroic 324. In
In some embodiments, fluorescence sensor 344 may be configured to capture fluorescent light to perform fluorescence imaging. For example, fluorescence sensor 344 may be an integrated device configured to perform fluorescence lifetime imaging, fluorescence spectral imaging (e.g., autofluorescence spectral imaging), and/or fluorescence intensity imaging. In the example of fluorescence lifetime imaging, fluorescence sensor 344 may be configured to receive incident fluorescent emissions and determine luminance lifetime information of the fluorescent emissions. In the example of fluorescence spectral imaging, fluorescence sensor 344 may be configured to determine luminance wavelength information of the fluorescent emissions. In the example of fluorescence intensity, fluorescence sensor 344 may be configured to determine luminance intensity information of the fluorescent emissions. In some embodiments, fluorescence sensor 344 may have one or more processors integrated thereon, and/or may be coupled to one or more processors onboard the imaging and/or measuring apparatus and configured to provide lifetime, wavelength, and/or intensity information to the processor(s) for image formation and/or measurement.
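The lifetime determination mentioned above can be illustrated with a minimal sketch: for a single-exponential decay, the lifetime can be estimated from a decay histogram by a log-linear least-squares fit. This is a generic estimation technique, not the method used by fluorescence sensor 344, and the synthetic data below assumes a noiseless decay with no background.

```python
import math

def estimate_lifetime_ns(times_ns, counts):
    """Estimate a single-exponential fluorescence lifetime (tau, ns)
    from a decay histogram via a log-linear least-squares fit.
    Assumes counts follow A * exp(-t / tau) with negligible background;
    real data would need noise and background handling."""
    ys = [math.log(c) for c in counts]
    n = len(times_ns)
    mean_x = sum(times_ns) / n
    mean_y = sum(ys) / n
    slope = sum((x - mean_x) * (y - mean_y)
                for x, y in zip(times_ns, ys)) \
        / sum((x - mean_x) ** 2 for x in times_ns)
    return -1.0 / slope  # log(counts) vs time has slope -1/tau

# Synthetic decay with tau = 2.5 ns, sampled every 0.5 ns
tau = 2.5
ts = [0.5 * i for i in range(10)]
cs = [1000.0 * math.exp(-t / tau) for t in ts]
print(round(estimate_lifetime_ns(ts, cs), 3))  # → 2.5
```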
In some embodiments, fluorescence sensor 344 may be alternatively or additionally configured to capture IR light and perform IR imaging. For example, in some embodiments, an IR light source may be disposed among fluorescence detection components 340 and configured to transmit IR light towards the subject's eye (e.g., by reflection via fluorescence dichroic 324), and fluorescence sensor 344 may be configured to receive reflected IR light from the subject's eye (e.g., by reflection via fluorescence dichroic 324).
In some embodiments, white light detection components 350 may be configured to capture white light received from the subject's eye to produce one or more images and/or measurements of the subject's eye. As shown in
In some embodiments, white light camera 354 may be configured to produce one or more images and/or measurements of the subject's eye using light received via MV lenses 352. In some embodiments, white light camera 354 may include a color camera. In some embodiments, white light camera 354 may include a monochrome camera. In some embodiments, white light camera 354 may be configured to receive at least some fluorescent emission light via fluorescence dichroic 324. In some embodiments, white light camera 354 may be configured to compensate for differences in spectral power at wavelengths at least partially reflected by fluorescence dichroic 324. In some embodiments white light camera 354 may be configured to output image and/or measurement information to one or more processors onboard the imaging and/or measuring apparatus.
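The spectral-power compensation mentioned above can be sketched as a per-channel gain: a channel whose wavelengths are partially reflected by the dichroic is scaled by the inverse of its transmitted fraction. The transmission fraction below is an assumed calibration value for illustration, not a measured property of fluorescence dichroic 324.

```python
def compensate(pixel_value, transmission, max_value=255):
    """Scale a color channel by the inverse of the fraction of its
    light transmitted through the dichroic, clamping to the sensor's
    maximum. The transmission fraction is an assumed calibration
    value, not one stated in the disclosure."""
    return min(pixel_value / transmission, max_value)

# If only 80% of red light reaches the camera, boost red by 1/0.8:
print(compensate(100, 0.8))  # → 125.0
```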
In some embodiments, white light and fluorescence components 500 may be configured in the manner described herein for white light and fluorescence components 300 including in connection with
In some embodiments, diopter flexure 772a may be configured to compress and/or decompress in response to a force exerted on diopter flexure 772a to adjust a positioning of MV lenses positioned along the axial direction through aperture 776. As shown in
As shown in
As shown in
In some embodiments, diopter motor 560 may be mechanically coupled to diopter flexure 572b to provide variable diopter compensation for the fluorescence detection components. For example, as shown in
In some embodiments, diopter motor 560 may be mechanically coupled to diopter flexure 572c to provide variable diopter compensation for the white light detection components via cam 564, such as by rotating cam 564 about the first direction Dir1. For example, as shown in
In some embodiments, diopter motor 560 may be configured to adjust diopter flexure 572a, 572b, and/or 572c with precision that can provide high resolution diopter compensation for the white light and/or fluorescence components 500. For example, diopter motor 560 may be configured to adjust diopter flexure 572a, 572b, and/or 572c with a precision of less than 100 micrometers and/or a precision of less than 50 micrometers. In this example, the precision with which diopter motor 560 is configured to adjust the diopter flexures can correspond to the movement distances, from diopter to diopter, of the lenses of the diopter flexures.
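The relationship between motor precision and diopter resolution described above can be sketched as a unit conversion. Both parameter values below are illustrative assumptions: the lens travel per diopter of compensation is hypothetical, and the 50 µm step size is taken from the precision range stated above.

```python
def steps_for_compensation(diopters, travel_um_per_diopter=500.0,
                           step_um=50.0):
    """Convert a requested diopter compensation into motor steps.
    travel_um_per_diopter is a hypothetical lens travel per diopter;
    step_um is the motor precision (the text states less than
    50-100 um). Neither value is specified in the disclosure."""
    travel_um = diopters * travel_um_per_diopter
    return round(travel_um / step_um)

print(steps_for_compensation(-2.0))  # -2 D → -20 steps of 50 um
print(steps_for_compensation(1.5))   # +1.5 D → 15 steps
```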
It should be appreciated that some embodiments may include multiple diopter motors, as embodiments described herein are not so limited. For example, a separate diopter motor may be coupled to each diopter flexure assembly 570a, 570b, and/or 570c.
Described herein are exemplary OCT and IR imaging and/or measuring components. Although the exemplary configurations described herein include each of OCT and IR imaging and/or measuring components, it should be appreciated that OCT and/or IR imaging and/or measuring components described herein may be included alone or in combination with one another and/or with other modes of imaging and/or measuring devices.
In some embodiments, source components 1310 may be configured to provide source light for illuminating the subject's eye via sample components 1320 and illuminating a reference surface 1346 of reference components 1340. In
In some embodiments, source components 1310 may alternatively or additionally include a vertical-cavity surface-emitting laser (VCSEL) with an adjustable mirror on one side. In some embodiments, the VCSEL may have a wavelength tuning range of greater than 100 nm using micro-electromechanical systems (MEMS) movement. In some embodiments, source components 1310 may alternatively or additionally include a plurality of light sources that combine to achieve a broad spectral width, such as a plurality of laser diodes, which can be a cost-effective way of achieving higher brightness and shorter pulse duration than SLDs in some cases.
In some embodiments, collimating cylindrical lens assembly 1314 may be configured to collimate light from SLD 1312 for illuminating the subject's eye. In some embodiments, source components 1310 may be configured to illuminate a line across the subject's eye to simultaneously perform a plurality of depth scans of the subject's eye. For example, collimating cylindrical lens assembly 1314 may be configured to transmit light from SLD 1312 in a line to illuminate the line across the subject's eye. In some embodiments, mirror 1316 may be configured to reflect the collimated light toward relay lenses 1318, which may be configured to relay the collimated light toward the subject's eye (e.g., pupil) via sample components 1320.
As shown in
In some embodiments, sample components 1320 may be configured to provide light from source components 1310 and fixation components 1370 to the subject's eye and provide light received from the subject's eye to OCT detection components 1350 and IR components 1380. In
In some embodiments, collimator lenses 1324 may be configured to provide a variable collimation of light from beam splitter 1322 toward the subject's eye via scan mirror 1326. For example, diopter motor 1390 may be configured to adjust a positioning of collimator lenses 1324 to adjust the collimation provided by collimator lenses 1324. In some embodiments, scan motor 1334 may be configured to steer scan mirror 1326 to steer light from beam splitter 1322 toward different portions of the subject's eye. For example, in some embodiments, source components 1310 may be configured to provide a line of illumination to scan mirror 1326, and scan mirror 1326 may be configured to steer the line of illumination across the subject's eye in a direction perpendicular to the line of illumination and perpendicular to the depth of the subject's eye. In some embodiments, the line of illumination may be horizontal across the subject's eye and scan mirror 1326 may be configured to steer the line of illumination vertically. It should be appreciated that any pair of perpendicular directions that are perpendicular to the depth direction of the subject's eye may be used for the illumination line and for steering using scan mirror 1326. In some embodiments, scan motor 1334 and/or scan mirror 1326 may include one or more stepper motors, galvanometers, polygonal scanners, microelectromechanical system (MEMS) mirrors, and/or other moving mirror devices.
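The stepped vertical sweep described above can be sketched as generating an evenly spaced sequence of mirror angles. The ±5° range and step count below are illustrative assumptions, not parameters of scan mirror 1326.

```python
def scan_mirror_angles(n_steps, max_angle_deg=5.0):
    """Evenly spaced mirror angles sweeping the illumination line from
    -max_angle_deg to +max_angle_deg. Both the angular range and the
    number of steps are illustrative, not values from the disclosure."""
    step = 2.0 * max_angle_deg / (n_steps - 1)
    return [-max_angle_deg + i * step for i in range(n_steps)]

print(scan_mirror_angles(5))  # → [-5.0, -2.5, 0.0, 2.5, 5.0]
```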
In some embodiments, IR dichroic 1328 may be configured to transmit light from beam splitter 1322 toward the subject's eye. In some embodiments, IR components 1380 may be configured to transmit IR illumination light to IR dichroic 1328, which may be configured to reflect the IR illumination light toward the subject's eye and reflect received IR light toward IR components 1380 for capturing an IR image and/or measurement of the subject's eye. In some embodiments, IR dichroic 1328 may be configured as a short-pass dichroic. In some embodiments, fixation dichroic 1330 may be configured to transmit light from beam splitter 1322 toward the subject's eye and reflect fixation light from fixation components 1370 toward the subject's eye. In some embodiments, fixation dichroic 1330 may be configured as a long-pass dichroic. In some embodiments, fixation dichroic 1330 may be configured to transmit at least some illumination and/or fixation light toward a PD, where the PD is configured to determine whether the amount of light to be transmitted toward the subject's eye exceeds a safety threshold.
In some embodiments, objective lenses 1332 may be configured to focus illumination light on the subject's eye and focus light received from the subject's eye such that the received light can be captured using OCT detection components 1350 and/or IR components 1380. In some embodiments, objective lenses 1332 may be configured in the manner described herein for objective lenses 328 including in connection with
In some embodiments, reference components 1340 may be configured to receive illumination light from source components 1310 via beam splitter 1322 and provide light reflected from reference surface 1346 to beam splitter 1322 to reflect toward OCT detection components 1350. In
In some embodiments, OCT detection components 1350 may be configured to receive light from the subject's eye and from reference components 1340 and capture an OCT image and/or measurement using the received light. In
In some embodiments, OCT sensor 1360 may be configured to capture one or more OCT images and/or measurements using light received via BE lenses 1352, transmissive grating 1354, focusing lens 1356, and field lenses 1358. In some embodiments, OCT sensor 1360 may be configured to determine a path length difference between light received from the subject's eye via sample components 1320 and light received from reference components 1340. For example, OCT sensor 1360 may be configured to determine a phase difference between the light received from the subject's eye and from reference components 1340. In some embodiments, OCT sensor 1360 may include an interferometer such as a Mach-Zehnder interferometer and/or a Michelson interferometer. In some embodiments in which source components 1310 include multiple laser diodes, the spectrum of each laser diode may be provided and/or superimposed by transmissive grating 1354 over separate wavelengths on OCT sensor 1360.
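The path-length determination described above can be illustrated with the standard spectral-domain OCT reconstruction: the interference spectrum between sample and reference light, recorded as a function of wavenumber, is Fourier-transformed to locate reflector depths. The 800-900 nm band, pixel count, and reflector depth below are illustrative values, not parameters of the apparatus described here.

```python
import numpy as np

# Simulate the interference spectrum for a single reflector and
# recover its depth. All numbers are illustrative assumptions.
n_pix = 1024
k = np.linspace(2 * np.pi / 900e-9, 2 * np.pi / 800e-9, n_pix)
true_depth = 150e-6                                # reflector at 150 um
spectrum = 1.0 + 0.5 * np.cos(2 * k * true_depth)  # fringe pattern

dk = k[1] - k[0]
a_scan = np.abs(np.fft.fft(spectrum - spectrum.mean()))
peak_bin = int(np.argmax(a_scan[1:n_pix // 2])) + 1  # skip DC bin
est_depth = np.pi * peak_bin / (n_pix * dk)          # bin → depth (m)
print(f"estimated depth: {est_depth * 1e6:.0f} um")  # close to 150 um
```

The fringe frequency in wavenumber is proportional to depth, which is why a single FFT recovers the full depth profile (an A-scan) at once; the line illumination described above simply yields many such A-scans in parallel.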
In some embodiments, fixation components 1370 may be configured to display to the subject's eye, via fixation dichroic 1330, a visible fixation object. In
In some embodiments, IR components 1380 may be configured to provide and/or capture IR light to and/or from the subject's eye via IR dichroic 1328. In
As shown in
The inventors have developed improved imaging techniques that may be implemented using imaging apparatuses described herein. According to various embodiments, such imaging techniques may be used for biometric identification, health status determination, disease diagnosis, and other applications.
The inventors have recognized that various health conditions may be indicated by the appearance of a person's retina fundus in one or more images captured according to techniques described herein. For example, diabetic retinopathy may be indicated by tiny bulges or micro-aneurysms protruding from the vessel walls of the smaller blood vessels, sometimes leaking fluid and blood into the retina. In addition, larger retinal vessels can begin to dilate and become irregular in diameter. Nerve fibers in the retina may begin to swell. Sometimes, the central part of the retina (macula) begins to swell, a condition known as macular edema. Damaged blood vessels may close off, causing the growth of new, abnormal blood vessels in the retina. Glaucomatous optic neuropathy, or glaucoma, may be indicated by thinning of the parapapillary retinal nerve fiber layer (RNFL) and optic disc cupping as a result of axonal and secondary retinal ganglion cell loss. The inventors have recognized that RNFL defects, for example indicated by OCT, are one of the earliest signs of glaucoma. In addition, age-related macular degeneration (AMD) may be indicated by the macula peeling and/or lifting, disturbances of macular pigmentation such as yellowish material under the pigment epithelial layer in the central retinal zone, and/or drusen such as macular drusen, peripheral drusen, and/or granular pattern drusen. AMD may also be indicated by geographic atrophy, such as a sharply delineated round area of hyperpigmentation, nummular atrophy, and/or subretinal fluid.
Stargardt's disease may be indicated by death of photoreceptor cells in the central portion of the retina. Macular edema may be indicated by a trench in an area surrounding the fovea. A macular hole may be indicated by a hole in the macula. Eye floaters may be indicated by non-focused optical path obscuring. Retinal detachment may be indicated by severe optic disc disruption, and/or separation from the underlying pigment epithelium. Retinal degeneration may be indicated by the deterioration of the retina. Central serous retinopathy (CSR) may be indicated by an elevation of sensory retina in the macula, and/or localized detachment from the pigment epithelium. Choroidal melanoma may be indicated by a malignant tumor derived from pigment cells initiated in the choroid. Cataracts may be indicated by an opaque lens, and may also cause blurring of fluorescence lifetimes and/or 2D retina fundus images. Macular telangiectasia may be indicated by a ring of fluorescence lifetimes increasing dramatically for the macula, and by smaller blood vessels degrading in and around the fovea. Alzheimer's disease and Parkinson's disease may be indicated by thinning of the RNFL. It should be appreciated that diabetic retinopathy, glaucoma, and other such conditions may lead to blindness or severe visual impairment if not properly screened and treated.
In another example, optic neuropathy, optic atrophy and/or choroidal folding can be indicated in images captured using techniques described herein. Optic neuropathy and/or optic atrophy may be caused by damage within the eye, such as glaucoma, optic neuritis, and/or papilledema, damage along the path of the optic nerve to the brain, such as a tumor, neurodegenerative disorder, and/or trauma, and/or congenital conditions such as Leber's hereditary optic atrophy (LHOA) and/or autosomal dominant optic atrophy (ADOA). For example, compressive optic atrophy may be indicated by and/or associated with such extrinsic signs as pituitary adenoma, intracranial meningioma, aneurysms, craniopharyngioma, mucoceles, papilloma, and/or metastasis, and/or such intrinsic signs as optic nerve glioma, optic nerve sheath (ONS) meningioma, and/or lymphoma. Vascular and/or ischemic optic atrophy may be indicated by and/or associated with sector disc pallor, non-arteritic anterior ischemic optic neuropathy (NAION), arteritic ischemic optic neuropathy (AION), severe optic atrophy with gliosis, giant cell arteritis, central retinal artery occlusion (CRAO), carotid artery occlusion, and/or diabetes. Neoplastic optic atrophy may be indicated by and/or associated with lymphoma, leukemia, tumor, and/or glioma. Inflammatory optic atrophy may be indicated by sarcoid, systemic lupus erythematosus (SLE), Behcet's disease, demyelination, such as multiple-sclerosis (MS) and/or neuromyelitis optica spectrum disorder (NMOSD) also known as Devic disease, allergic angiitis (AN), and/or Churg-Strauss syndrome. Infectious optic atrophy may be indicated by the presence of a viral, bacterial, and/or fungal infection. Radiation optic neuropathy may also be indicated.
Moreover, in some embodiments, an imaging apparatus may be configured to detect a concussion at least in part by tracking the movement of a person's eye(s) over a sequence of images. For example, iris sensors, white light imaging components, and/or other imaging components described herein may be configured to track the movement of the person's eyes for various indications of a concussion. Toxic optic atrophy and/or nutritional optic atrophy may be indicated in association with ethambutol, amiodarone, methanol, vitamin B12 deficiency, and/or thyroid ophthalmopathy. Metabolic optic atrophy may be indicated by and/or associated with diabetes. Genetic optic atrophy may be indicated by and/or associated with ADOA and/or LHOA. Traumatic optic neuropathy may be indicated by and/or associated with trauma to the optic nerve, ONS hematoma, and/or a fracture.
Accordingly, in some embodiments, a person's predisposition to various medical conditions may be determined based on one or more images of the person's retina fundus captured according to techniques described herein. For example, if one or more of the above described signs of a particular medical condition (e.g., macula peeling and/or lifting for AMD) is detected in the captured image(s), the person may be predisposed to that medical condition.
The inventors have also recognized that some health conditions may be detected using fluorescence imaging techniques described herein. For example, macular holes may be detected using an excitation light wavelength between 340-500 nm to excite retinal pigment epithelium (RPE) and/or macular pigment in the subject's eye having a fluorescence emission wavelength of 540 nm and/or between 430-460 nm. Fluorescence from RPE may be primarily due to lipofuscin from RPE lysosomes. Retinal artery occlusion may be detected using an excitation light wavelength of 445 nm to excite flavin adenine dinucleotide (FAD), RPE, and/or nicotinamide adenine dinucleotide (NADH) in the subject's eye having a fluorescence emission wavelength between 520-570 nm. AMD including drusen may be detected using an excitation light wavelength between 340-500 nm to excite RPE in the subject's eye having a fluorescence emission wavelength of 540 nm and/or between 430-460 nm. AMD including geographic atrophy may be detected using an excitation light wavelength of 445 nm to excite RPE and elastin in the subject's eye having a fluorescence emission wavelength between 520-570 nm. AMD of the neovascular variety may be detected by exciting the subject's choroid and/or inner retina layers. Diabetic retinopathy may be detected using an excitation light wavelength of 448 nm to excite FAD in the subject's eye having a fluorescence emission wavelength between 560-590 nm. Central serous chorio-retinopathy (CSCR) may be detected using an excitation light wavelength of 445 nm to excite RPE and elastin in the subject's eye having a fluorescence emission wavelength between 520-570 nm. Stargardt's disease may be detected using an excitation light wavelength between 340-500 nm to excite RPE in the subject's eye having a fluorescence emission wavelength of 540 nm and/or between 430-460 nm.
Choroideremia may be detected using an excitation light wavelength between 340-500 nm to excite RPE in the subject's eye having a fluorescence emission wavelength of 540 nm and/or between 430-460 nm.
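The excitation/emission pairings above can be organized as a simple lookup table. The sketch below (hypothetical names; wavelength bands taken from the ranges described above, in nm) illustrates how a controller might select a light-source setting for a target condition; it is an illustrative organization, not a prescribed implementation.

```python
# Hypothetical lookup of excitation/emission bands (nm) for fluorescence
# imaging of various conditions; bands are taken from the description above.
# Single wavelengths are represented as degenerate (min, max) pairs.
FLUORESCENCE_BANDS = {
    "macular_hole":           {"excitation": (340, 500), "emission": [(540, 540), (430, 460)]},
    "retinal_artery_occlusion": {"excitation": (445, 445), "emission": [(520, 570)]},
    "amd_drusen":             {"excitation": (340, 500), "emission": [(540, 540), (430, 460)]},
    "amd_geographic_atrophy": {"excitation": (445, 445), "emission": [(520, 570)]},
    "cscr":                   {"excitation": (445, 445), "emission": [(520, 570)]},
    "stargardt":              {"excitation": (340, 500), "emission": [(540, 540), (430, 460)]},
    "choroideremia":          {"excitation": (340, 500), "emission": [(540, 540), (430, 460)]},
}

def excitation_band(condition):
    """Return the (min, max) excitation wavelength in nm for a condition."""
    return FLUORESCENCE_BANDS[condition]["excitation"]
```

A controller could use such a table to configure both the excitation source and the emission filter for a selected condition in one step.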
The inventors have also developed techniques for using a captured image of a person's retina fundus to diagnose various health issues of the person. For example, in some embodiments, any of the health conditions described above may be diagnosed.
In some embodiments, imaging techniques described herein may be used for health status determination, which may include determinations relating to cardiac health, cardiovascular disease and/or cardiovascular risk, anemia, retinal toxicity, body mass index, water weight, hydration status, muscle mass, age, smoking habits, blood oxygen levels, heart rate, white blood cell counts, red blood cell counts, and/or other such health attributes. For example, in some embodiments, a light source having a bandwidth of at least 40 nm may be configured with sufficient imaging resolution for capturing red blood cells having a diameter of approximately 6 μm and white blood cells having diameters of at least 15 μm. Accordingly, imaging techniques described herein may be configured to facilitate sorting and counting of red and white blood cells, estimating the density of each within the blood, and/or other such determinations.
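The diameter-based sorting and counting just described can be sketched as a simple threshold classifier. The function names and the tolerance around the ~6 μm red-cell diameter are assumptions for illustration, not part of the described apparatus:

```python
def classify_cell(diameter_um):
    """Classify a blood cell by measured diameter: red cells ~6 um,
    white cells at least 15 um (per the diameters quoted above)."""
    if diameter_um >= 15.0:
        return "white"
    if 5.0 <= diameter_um <= 8.0:  # assumed tolerance around ~6 um
        return "red"
    return "unknown"

def count_cells(diameters_um):
    """Count cells by class from a sequence of measured diameters."""
    counts = {"red": 0, "white": 0, "unknown": 0}
    for d in diameters_um:
        counts[classify_cell(d)] += 1
    return counts
```

Dividing such counts by the imaged blood volume would then give the per-class density estimate mentioned above.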
In some embodiments, imaging techniques described herein may facilitate tracking of the movement of blood cells to measure blood flow rates. In some embodiments, imaging techniques described herein may facilitate tracking the width of the blood vessels, which can provide an estimate of blood pressure changes and perfusion. For example, an imaging apparatus as described herein configured to resolve red and white blood cells using a 2-dimensional (2D) spatial scan completed within 1 μs may be configured to capture movement of blood cells at 1 meter per second. In some embodiments, light sources that may be included in apparatuses described herein, such as super-luminescent diodes, LEDs, and/or lasers, may be configured to emit sub-microsecond light pulses such that an image may be captured in less than one microsecond. Using spectral scan techniques described herein, an entire cross section of a scanned line (e.g., in the lateral direction) versus depth can be captured in a sub-microsecond. In some embodiments, a 2D sensor described herein may be configured to capture such images for internal or external reading at a slow rate and subsequent analysis. In some embodiments, a 3D sensor may be used. Embodiments described below overcome the challenges of obtaining multiple high quality scans within a single microsecond.
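As a rough illustration of the timing argument above: a cell moving at 1 meter per second travels 1 μm per microsecond, so two scans separated by 1 μs must resolve micrometer-scale displacement. A minimal sketch (hypothetical helper, unit-conversion arithmetic only) of estimating flow velocity from a tracked cell's displacement between scans:

```python
def flow_velocity_m_per_s(displacement_um, interval_us):
    """Estimate blood flow velocity from the lateral displacement (in
    micrometers) of a tracked cell between two scans taken interval_us
    microseconds apart. Returns meters per second."""
    return (displacement_um * 1e-6) / (interval_us * 1e-6)
```

For example, a 1 μm displacement over a 1 μs scan interval corresponds to the 1 m/s figure quoted above.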
In some embodiments, imaging apparatuses described herein may be configured to scan a line aligned along a blood vessel direction. For example, the scan may be rotated and positioned after identifying a blood vessel configuration of the subject's retina fundus and selecting a larger vessel for observation. In some embodiments, a blood vessel that is small and only allows one cell to transit the vessel in sequence may be selected such that the selected vessel fits within a single scan line. In some embodiments, limiting the target imaging area to a smaller section of the subject's eye may reduce the collection area for the imaging sensor. In some embodiments, using a portion of the imaging sensor facilitates increasing the imaging frame rate to tens of kHz. In some embodiments, imaging apparatuses described herein may be configured to perform a fast scan over a small area of the subject's eye while reducing spectral spread interference. For example, each scanned line may use a different section of the imaging sensor array. Accordingly, multiple depth scans may be captured at the same time, where each scan is captured by a respective portion of the imaging sensor array. In some embodiments, each scan may be magnified to result in wider spacing on the imaging sensor array, such as wider than the dispersed spectrum, so that each depth scan may be measured independently.
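The alignment step described above, rotating the scan line to follow a selected vessel, reduces to computing the vessel's local orientation. The sketch below (hypothetical function; assumes the vessel segment is locally straight and defined by two points in image coordinates) illustrates one way to derive the rotation angle:

```python
import math

def scan_rotation_deg(vessel_start, vessel_end):
    """Angle in degrees to rotate a horizontal scan line so that it lies
    along a vessel segment defined by two (x, y) points in image
    coordinates; quadrant-aware via atan2."""
    dx = vessel_end[0] - vessel_start[0]
    dy = vessel_end[1] - vessel_start[1]
    return math.degrees(math.atan2(dy, dx))
```

A scan controller could apply this angle when positioning the scan so that a vessel running diagonally across the fundus image stays within a single scanned line.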
Having thus described several aspects and embodiments of the technology set forth in the disclosure, it is to be appreciated that various alterations, modifications, and improvements will readily occur to those skilled in the art. Such alterations, modifications, and improvements are intended to be within the spirit and scope of the technology described herein. For example, those of ordinary skill in the art will readily envision a variety of other means and/or structures for performing the function and/or obtaining the results and/or one or more of the advantages described herein, and each of such variations and/or modifications is deemed to be within the scope of the embodiments described herein. Those skilled in the art will recognize, or be able to ascertain using no more than routine experimentation, many equivalents to the specific embodiments described herein. It is, therefore, to be understood that the foregoing embodiments are presented by way of example only and that, within the scope of the appended claims and equivalents thereto, inventive embodiments may be practiced otherwise than as specifically described. In addition, any combination of two or more features, systems, articles, materials, kits, and/or methods described herein, if such features, systems, articles, materials, kits, and/or methods are not mutually inconsistent, is included within the scope of the present disclosure.
The above-described embodiments can be implemented in any of numerous ways. One or more aspects and embodiments of the present disclosure involving the performance of processes or methods may utilize program instructions executable by a device (e.g., a computer, a processor, or other device) to perform, or control performance of, the processes or methods. In this respect, various inventive concepts may be embodied as a computer readable storage medium (or multiple computer readable storage media) (e.g., a computer memory, one or more floppy discs, compact discs, optical discs, magnetic tapes, flash memories, circuit configurations in Field Programmable Gate Arrays or other semiconductor devices, or other tangible computer storage medium) encoded with one or more programs that, when executed on one or more computers or other processors, perform methods that implement one or more of the various embodiments described above. The computer readable medium or media can be transportable, such that the program or programs stored thereon can be loaded onto one or more different computers or other processors to implement various ones of the aspects described above. In some embodiments, computer readable media may be non-transitory media.
The terms “program” or “software” are used herein in a generic sense to refer to any type of computer code or set of computer-executable instructions that can be employed to program a computer or other processor to implement various aspects as described above. Additionally, it should be appreciated that according to one aspect, one or more computer programs that when executed perform methods of the present disclosure need not reside on a single computer or processor, but may be distributed in a modular fashion among a number of different computers or processors to implement various aspects of the present disclosure.
Computer-executable instructions may be in many forms, such as program modules, executed by one or more computers or other devices. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Typically the functionality of the program modules may be combined or distributed as desired in various embodiments.
Also, data structures may be stored in computer-readable media in any suitable form. For simplicity of illustration, data structures may be shown to have fields that are related through location in the data structure. Such relationships may likewise be achieved by assigning storage for the fields with locations in a computer-readable medium that convey relationship between the fields. However, any suitable mechanism may be used to establish a relationship between information in fields of a data structure, including through the use of pointers, tags or other mechanisms that establish relationship between data elements.
When implemented in software, the software code can be executed on any suitable processor or collection of processors, whether provided in a single computer or distributed among multiple computers.
Further, it should be appreciated that a computer may be embodied in any of a number of forms, such as a rack-mounted computer, a desktop computer, a laptop computer, or a tablet computer, as non-limiting examples. Additionally, a computer may be embedded in a device not generally regarded as a computer but with suitable processing capabilities, including a Personal Digital Assistant (PDA), a smartphone or any other suitable portable or fixed electronic device.
Also, a computer may have one or more input and output devices. These devices can be used, among other things, to present a user interface. Examples of output devices that can be used to provide a user interface include printers or display screens for visual presentation of output and speakers or other sound generating devices for audible presentation of output. Examples of input devices that can be used for a user interface include keyboards, and pointing devices, such as mice, touch pads, and digitizing tablets. As another example, a computer may receive input information through speech recognition or in other audible formats.
Such computers may be interconnected by one or more networks in any suitable form, including a local area network or a wide area network, such as an enterprise network, and intelligent network (IN) or the Internet. Such networks may be based on any suitable technology and may operate according to any suitable protocol and may include wireless networks, wired networks or fiber optic networks.
The acts performed as part of the methods may be ordered in any suitable way. Accordingly, embodiments may be constructed in which acts are performed in an order different than illustrated, which may include performing some acts simultaneously, even though shown as sequential acts in illustrative embodiments.
All definitions, as defined and used herein, should be understood to control over dictionary definitions, definitions in documents incorporated by reference, and/or ordinary meanings of the defined terms.
The terms “front” and “rear,” used herein in the context of describing the exemplary imaging and/or measuring apparatuses and portions thereof shown in the drawings, refer to portions of the imaging and/or measuring apparatus facing and/or positioned proximate the subject to be imaged and facing and/or positioned opposite from the subject to be imaged, respectively. It should be appreciated that imaging and/or measuring apparatuses could take other forms in which elements or views described herein as “front” or “rear” may face other directions or be positioned differently with respect to the subject or subjects to be imaged, as embodiments described herein are not so limited.
The indefinite articles “a” and “an,” as used herein in the specification and in the claims, unless clearly indicated to the contrary, should be understood to mean “at least one.”
The phrase “and/or,” as used herein in the specification and in the claims, should be understood to mean “either or both” of the elements so conjoined, i.e., elements that are conjunctively present in some cases and disjunctively present in other cases. Multiple elements listed with “and/or” should be construed in the same fashion, i.e., “one or more” of the elements so conjoined. Other elements may optionally be present other than the elements specifically identified by the “and/or” clause, whether related or unrelated to those elements specifically identified. Thus, as a non-limiting example, a reference to “A and/or B”, when used in conjunction with open-ended language such as “comprising” can refer, in one embodiment, to A only (optionally including elements other than B); in another embodiment, to B only (optionally including elements other than A); in yet another embodiment, to both A and B (optionally including other elements); etc.
As used herein in the specification and in the claims, the phrase “at least one,” in reference to a list of one or more elements, should be understood to mean at least one element selected from any one or more of the elements in the list of elements, but not necessarily including at least one of each and every element specifically listed within the list of elements and not excluding any combinations of elements in the list of elements. This definition also allows that elements may optionally be present other than the elements specifically identified within the list of elements to which the phrase “at least one” refers, whether related or unrelated to those elements specifically identified. Thus, as a non-limiting example, “at least one of A and B” (or, equivalently, “at least one of A or B,” or, equivalently “at least one of A and/or B”) can refer, in one embodiment, to at least one, optionally including more than one, A, with no B present (and optionally including elements other than B); in another embodiment, to at least one, optionally including more than one, B, with no A present (and optionally including elements other than A); in yet another embodiment, to at least one, optionally including more than one, A, and at least one, optionally including more than one, B (and optionally including other elements); etc.
Also, the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting. The use of “including,” “comprising,” or “having,” “containing,” “involving,” and variations thereof herein, is meant to encompass the items listed thereafter and equivalents thereof as well as additional items.
In the claims, as well as in the specification above, all transitional phrases such as “comprising,” “including,” “carrying,” “having,” “containing,” “involving,” “holding,” “composed of,” and the like are to be understood to be open-ended, i.e., to mean including but not limited to. Only the transitional phrases “consisting of” and “consisting essentially of” shall be closed or semi-closed transitional phrases, respectively.
This application claims the benefit under 35 U.S.C. § 119(e) of: U.S. Provisional Patent Application Ser. No. 63/155,866, filed Mar. 3, 2021, under Attorney Docket No.: T0753.70022US01, and entitled, “PORTABLE EYE IMAGING AND/OR MEASURING APPARATUS;” and U.S. Provisional Patent Application Ser. No. 63/047,536, filed Jul. 2, 2020, under Attorney Docket No.: T0753.70022US00, and titled, “NOVEL FUNDUS IMAGER,” each application of which is hereby incorporated herein by reference in its entirety.
Number | Date | Country
---|---|---
63155866 | Mar 2021 | US
63047536 | Jul 2020 | US