The disclosed invention relates to devices, systems, and methods for treating dry eye syndrome by recording and analyzing video of the anterior eye using an illumination device with attached mobile computer, wherein the recordings can include an overlaid simplified Placido ring pattern. Also disclosed are methods to use the recorded videos to measure and analyze dry eye-related parameters, to perform dry eye condition monitoring, and to provide dry eye disease self-treatment guidance.
Dry eye syndrome (DES) is a prevalent condition among adults. Based on data from the National Health and Wellness Survey, 6.8 percent of the United States adult population (approximately 16.4 million people) have been diagnosed with dry eye syndrome (Farrand, K. F., et al., Am. J. Ophthalmol. 2017; 182:90.). DES affects middle-aged and older adults, contact lens wearers, computer users, gamers, and people living and/or working in dry environments. There is no cure for DES, but symptoms can be relieved with proper diagnosis and treatment.
DES occurs when the eyes do not produce enough tears, or when tears evaporate too quickly or are obstructed from the surface of the eye. Current methods to diagnose DES involve measuring tear film break-up time, tear film thickness, lipid layer thickness, and/or aqueous layer thickness, as well as performing tests such as a symptom assessment questionnaire, a fluorescein staining test, a Schirmer test, or another appropriate test.
There are several devices in the art for diagnosing and monitoring DES that use corneal topography measurement methods, which typically illuminate the eye with a set of concentric lighted rings, known as a Placido ring or keratoscope pattern. Corneal topography began as a means of assessing astigmatism and other eye irregularities affecting the topography or shape of the cornea. Such devices are often repurposed to treat DES, and commonly include desktop devices such as the KOWA DR-1α from Kowa Company Ltd., the OCULUS Keratograph® 5M from Oculus Inc., and the E300 Corneal Topographer from Medmont.
A subset of corneal topography devices is designed specifically for DES treatment, for example, the TEARSCIENCE LIPIVIEW™ II from Johnson & Johnson. Several patents and patent applications disclose such devices, including, for example, U.S. Pat. No. 10,980,413, which discloses a device and accompanying method for diagnosing, measuring, or analyzing DES, and discusses the ability to determine tear film break-up time and to detect lid margin contact and blink rates. Similarly, U.S. Pat. No. 8,888,286 discloses a DES-specific device capable of measuring the tear film layer thickness of the ocular tear film, a lipid layer thickness, and an aqueous layer thickness on the ocular surface. Another DES-specific device is disclosed in U.S. Pat. No. 8,591,033, wherein the device can measure the relative thickness of the lipid layer of the precorneal tear film after it is disturbed by blinking. Japan Pat. No. JP2010273800A discloses a DES device and method for measuring an individual's blinking action. Finally, WIPO Application No. WO2015073664A2 discloses a device and accompanying method for diagnosing, measuring, or analyzing DES, including the detection of eyelid margin contact and blink rates. Additionally, portable corneal topography measurement devices for medical professionals are in use, such as the EasyTear® View+ from EasyTear, and the Tearscope from Keeler. These devices can be attached to a slit lamp to provide illumination, and to a mobile computing device, e.g., a tablet, for user interface and data processing.
Unfortunately, because the precision required of corneal topography devices for treating eye irregularities is greater than the precision required to treat DES, such devices tend to be more expensive than is necessary to provide reliable DES diagnosis and monitoring. In addition to being prohibitively costly, such DES treatment devices are typically non-portable, and only provided to medical professionals, meaning there is no way for patients to monitor DES at home.
Also known in the art are portable devices to be used with a smart phone for assessing and treating eye irregularities. As with other corneal topography devices, portable versions are prohibitively expensive, provide unnecessary capabilities, and are not optimized for DES diagnosis and treatment. For example, U.S. Pat. No. 9,839,352 discloses a corneal topography device that includes a Placido disc illumination system configured to be used with the camera of a mobile communication device.
Therefore, a clear need exists for a portable DES diagnosis and monitoring device that allows a clinician to rapidly record and measure a patient's dry eye condition. The clinical device disclosed includes a camera for each of a patient's right eye and left eye so that both eyes can be imaged simultaneously. Accompanying software changes allow automatic right eye and left eye identification and analysis. Also included is the capability to adjust the interpupillary distance (IPD) between the cameras, as well as the up and down position of the cameras to allow the cameras to be positioned correctly in front of a patient's eyes. Such embodiments further include a capability to change the Placido ring projector pattern rapidly.
These and other deficiencies of the prior art are addressed by one or more embodiments of the disclosed invention. Additional advantages and novel features of this invention shall be set forth in part in the description that follows, and in part will become apparent to those skilled in the art upon examination of the following specification or may be learned by the practice of the invention. The advantages of the invention may be realized and attained by means of the instrumentalities, combinations, compositions, and methods particularly pointed out hereafter.
The features and advantages described in this disclosure and in the following detailed description are not all-inclusive. Many additional features and advantages will be apparent to one of ordinary skill in the relevant art in view of the drawings, specification, and claims hereof. Moreover, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes and may not have been selected to delineate or circumscribe the inventive subject matter; reference to the claims is necessary to determine such inventive subject matter.
Features and objects of the present invention and the manner of attaining them will become more apparent, and the invention itself will be best understood, by reference to the following description of one or more embodiments taken in conjunction with the accompanying drawings and figures imbedded in the text below and attached following this description.
The Figures depict embodiments of the present invention for purposes of illustration only. One skilled in the art will readily recognize from the following discussion that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles of the invention described herein.
Dry eye syndrome (DES) is a medical condition having various causes and is characterized by having dry eyes. DES is accompanied by related symptoms, such as eye irritation and redness, fluid discharge from the eyes, and eyes that tire easily. Symptoms may range from mild and occasional to severe and continuous. The condition is also known as keratoconjunctivitis sicca.
The disclosed invention includes devices, systems, and methods for dry eye condition measurement and analysis that are intended for day-to-day use by laypeople. Disclosed are embodiments of an illuminator device, including a handheld version and a smartphone-mounted version. Also disclosed are embodiments of a mobile software application, and accompanying methods to perform dry eye syndrome diagnosis, assessment, and treatment. The illuminator devices are configured to perform anterior eye video recording and can overlay such recordings with projected grid patterns. The application operates the illuminator, performs video analysis, computes dry eye-related parameters, summarizes the measured values, presents longitudinal data trends, and guides self-treatment. It also provides user administrative functions and facilitates data storage locally and on cloud servers.
The present invention employs Artificial Intelligence (AI) to determine and categorize the severity of a patient's dry eye condition and to provide corresponding treatment recommendations. In one instance, parameters including blink rate and tear meniscus height measurement are used to provide a quick assessment of the patient's condition and to recommend over-the-counter dry eye treatment solutions. In another instance, parameters including blink rate, tear meniscus height measurement, iris surface interferometric pattern, and tear film pattern with Placido ring projection are used to provide an in-depth analysis of the patient's dry eye condition within an ophthalmic setting. In both instances, each clinical parameter is evaluated and assigned a weighting, in alignment with a pre-determined weighting system, to provide a clinically relevant dry eye categorization for the patient, with corresponding treatment recommendations based on the severity of the condition detected.
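By way of non-limiting illustration, the following sketch shows one way such a pre-determined weighting system could combine clinical parameters into a severity category. The weights, thresholds, and category labels are illustrative assumptions only; the disclosure does not fix their values.

```python
# Illustrative sketch of a weighted dry eye severity scoring scheme.
# All weights, thresholds, and category labels are hypothetical
# placeholders; the disclosure does not specify the actual values.

def dry_eye_severity(blink_rate_per_min: float,
                     tear_meniscus_height_mm: float,
                     lipid_pattern_grade: int = 0,
                     ring_distortion_grade: int = 0) -> str:
    """Combine weighted clinical parameters into a severity category."""
    score = 0.0
    # Low blink rates and thin tear menisci are associated with dry eye;
    # each parameter contributes a weighted sub-score in [0, its weight].
    if blink_rate_per_min < 10:          # assumed normal range ~10-20/min
        score += 0.35 * (10 - blink_rate_per_min) / 10
    if tear_meniscus_height_mm < 0.2:    # assumed normal threshold ~0.2 mm
        score += 0.35 * (0.2 - tear_meniscus_height_mm) / 0.2
    score += 0.15 * min(lipid_pattern_grade, 4) / 4     # interferometric grade 0-4
    score += 0.15 * min(ring_distortion_grade, 4) / 4   # Placido ring grade 0-4

    if score < 0.25:
        return "normal - no treatment indicated"
    if score < 0.5:
        return "mild - over-the-counter artificial tears suggested"
    if score < 0.75:
        return "moderate - clinical evaluation recommended"
    return "severe - referral to ophthalmologist recommended"

print(dry_eye_severity(blink_rate_per_min=7, tear_meniscus_height_mm=0.12))
```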
The disclosed invention will now be described in detail with reference to several embodiments thereof as illustrated in the accompanying Figures. In the following description, specific details are set forth to provide a thorough understanding of embodiments of the disclosed invention. It will be apparent, however, to one skilled in the art that embodiments may be practiced without some or all these specific details. In other instances, well known process steps and/or structures have not been described in detail to not unnecessarily obscure the invention. The features and advantages of embodiments may be better understood with reference to the drawings and discussions that follow.
It should be apparent to those skilled in the art that the described embodiments of the disclosed invention provided herein are illustrative only and not limiting, having been presented by way of example only. All features disclosed in this description may be replaced by alternative features serving the same or similar purpose, unless expressly stated otherwise. Therefore, numerous other embodiments and modifications thereof are contemplated as falling within the scope of the disclosed invention as defined herein and equivalents thereto. Hence, use of absolute and/or sequential terms, such as, for example, “always,” “will,” “will not,” “shall,” “shall not,” “must,” “must not,” “first,” “initially,” “next,” “subsequently,” “before,” “after,” “lastly,” and “finally,” is not meant to limit the scope of the disclosed invention, as the embodiments disclosed herein are merely exemplary.
It will be also understood that when an element is referred to as being “on,” “attached” to, “connected” to, “coupled” with, “contacting”, “mounted” etc., another element, it can be directly on, attached to, connected to, coupled with, or contacting the other element or intervening elements may also be present. In contrast, when an element is referred to as being, for example, “directly on,” “directly attached” to, “directly connected” to, “directly coupled” with or “directly contacting” another element, there are no intervening elements present. It will also be appreciated by those of skill in the art that references to a structure or feature that is disposed “adjacent” another feature may have portions that overlap or underlie the adjacent feature.
Spatially relative terms, such as “under,” “below,” “lower,” “over,” “upper” and the like, may be used to describe one element or feature's relationship to another element(s) or feature(s) as illustrated in the figures. Such spatially relative terms are intended to encompass different orientations of a device in use or operation in addition to the orientation depicted in the figures. For example, if a device in the figures is inverted, elements described as “under,” or “beneath” other elements or features would then be oriented “over” the other elements or features. Thus, the exemplary term “under” can encompass both an orientation of “over” and “under”. The device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly. Similarly, the terms “upwardly,” “downwardly,” “vertical,” “horizontal” and the like are used herein for the purpose of explanation only unless specifically indicated otherwise.
Included in the description are flowcharts depicting examples of the methodology which may be used in the disclosed system for dry eye syndrome diagnosis and treatment. In the following description, it will be understood that each block of the flowchart illustrations, and combinations of blocks in the flowchart illustrations, can be implemented by computer program instructions. These computer program instructions may be loaded onto a computer or other programmable apparatus to produce a machine such that the instructions that execute on the computer or other programmable apparatus create means for implementing the functions specified in the flowchart block or blocks. These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable apparatus to function in a particular manner such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means that implement the function specified in the flowchart block or blocks. The computer program instructions may also be loaded onto a computer or other programmable apparatus to cause a series of operational steps to be performed in the computer or on the other programmable apparatus to produce a computer implemented process such that the instructions that execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart block or blocks.
Accordingly, blocks of the flowchart illustrations support combinations of means for performing the specified functions and combinations of steps for performing the specified functions. It will also be understood that each block of the flowchart illustrations, and combinations of blocks in the flowchart illustrations, can be implemented by special purpose hardware-based computer systems that perform the specified functions or steps, or combinations of special purpose hardware and computer instructions.
Unless specifically stated otherwise, discussions herein using words such as “processing,” “computing,” “calculating,” “determining,” “presenting,” “displaying,” or the like may refer to actions or processes of a machine (e.g., a computer) that manipulates or transforms data represented as physical (e.g., electronic, magnetic, or optical) quantities within one or more memories (e.g., volatile memory, non-volatile memory, or a combination thereof), registers, or other machine components that receive, store, transmit, or display information.
One of reasonable skill in the art will also recognize that portions of the present invention may be implemented on a general-purpose mobile computing system, such as a smartphone, a personal communication device, a mobile device, a notebook computer, a tablet, or the like.
CPU 101 comprises a suitable processor for implementing the present invention. The CPU 101 communicates with other components of the system via a bi-directional system bus 120 (including any necessary input/output (I/O) controller 107 circuitry and other “glue” logic). The bus, which includes address lines for addressing system memory, provides data transfer between and among the various components. Random-access memory 102 serves as the working memory for the CPU 101. The read-only memory (ROM) 103 contains the basic input/output system code (BIOS), a set of low-level routines in the ROM that application programs and the operating systems can use to interact with the hardware, including reading characters from the touchscreen or keyboard, outputting characters to screens or printers, and so forth.
Mass storage devices 115, 116 provide persistent storage on fixed and removable media, such as magnetic, optical, or magnetic-optical storage systems, flash memory, or any other available mass storage technology. The mass storage may be shared on a network 150, or it may be a dedicated mass storage. As shown in
In basic operation, program logic (including that which implements methodology of the present invention described below) is loaded from the removable storage 115 or fixed storage 116 into the main (RAM) memory 102, for execution by the CPU 101. During operation of the program logic, the system 100 accepts user input from a keyboard and pointing device, as well as speech-based input from a voice recognition system (not shown). The user interface permits selection of application programs, entry of keyboard-based input or data, and selection and manipulation of individual data objects displayed on the screen or display device 105. Likewise, the pointing device, such as a mouse, track ball, pen device, touch screen, or the like, permits selection and manipulation of objects on the display device. In this manner, these input devices support manual user input for any process running on the system.
The computer system 100 displays text and/or graphic images and other data on the display device 105. The video adapter 104, which is interposed between the display 105 and the system's bus, drives the display device 105. The video adapter 104, which includes video memory accessible to the CPU 101, provides circuitry that converts pixel data stored in the video memory to a raster signal suitable for use by a monitor or touchscreen. A hard copy of the displayed information, or other information within the system 100, may be obtained from a printer or other output device.
The system itself communicates with other devices (e.g., other computers) via the network interface card (NIC) 111 connected to a network (e.g., cellular network, Wi-Fi network, Bluetooth wireless network, or the like). The system 100 may also communicate with local occasionally connected devices (e.g., serial cable-linked devices) via the communication (COMM) interface 110, which may include a RS-232 serial port, a Universal Serial Bus (USB) interface, or the like. Devices that will be commonly connected locally to the interface 110 include laptop computers, handheld organizers, digital cameras, and the like.
The system may be implemented through various wireless networks and their associated communication devices. Such networks may include modems, mainframe computers, or servers, such as a gateway computer or application server which may have access to a database. A gateway computer serves as a point of entry into each network and may be coupled to another network by means of a communications link. The gateway may also be directly or indirectly coupled to one or more devices using a communications link or may be coupled to a storage device such as a data repository or database.
The disclosed invention includes devices, systems, and methods for the non-invasive diagnosis and treatment of DES in a portable format suitable for everyday use by patients. The invention includes a system comprising embodiments of an illuminator device for eye examination that is configured for use with a smartphone camera, and a software application (app) configured to run on a smartphone. Also included are methods for use of the system to perform DES diagnosis and treatment. Using the smartphone camera, the disclosed device records video imagery of a patient's eye. The accompanying application analyzes feature distortions on the ocular surface by performing feature extraction on the recorded video. Using such data, the application determines tear film break-up time and meniscus layer height, among other DES-relevant parameters. The application can compare these results to historical results gathered on a given individual, a baseline value for the individual, or a normalized baseline for healthy individuals to diagnose or measure the progression of DES.
The disclosed invention includes devices, systems, and methods for the non-invasive diagnosis and treatment of DES in a portable format suitable for use in a clinical setting. The invention includes a system comprising embodiments of a binocular illuminator device for eye examination that includes components for the rapid imaging and analysis of both eyes simultaneously. The binocular illuminator includes an imaging component for each eye, each imaging component including a camera, light source, simplified Placido ring projector, and lens system. The binocular illuminator also includes an adjustment mechanism to move the imaging components closer together or further apart to adjust for a patient's IPD. This embodiment further includes an adjustment mechanism to move the imaging components up or down relative to the casing to adjust for alignment with the patient's eyes. Each imaging component includes a removable Placido ring projector and a mechanism for rapidly changing the ring pattern in use.
The illuminator is configured for use with a smartphone that can be installed in the binocular illuminator case, and a software application (app) configured to run on a smartphone. Also included are methods for use of the system to perform DES diagnosis and treatment. Using the dual onboard cameras, the disclosed device records video imagery of each of a patient's eyes and the application stitches the images into a composite image of the two eyes together, allowing the eyes to be imaged and analyzed together, speeding up the treatment process. The application also assesses the IPD and up and down alignment and informs the user if the imaging components need to be adjusted to be properly positioned and aligned in front of each eye.
The accompanying application analyzes feature distortions on the ocular surface by performing feature extraction on the recorded video. Using such data, the application determines tear film break-up time and meniscus layer height, among other DES-relevant parameters. The application can compare these results to historical results gathered on a given individual, a baseline value for the individual, or a normalized baseline for healthy individuals to diagnose or measure the progression of DES.
With reference to
The camera 220 is housed in a first casing 214 that mechanically interacts with a second casing 230 to house the illuminator components. The second casing includes side walls 232 that attach to the first casing, and an eye cup 234 configured to ergonomically interact with an eye socket of a patient. The eye cup 234 may be shaped to conform to the patient's face to block out ambient light and may include a rounded lip 236 to improve user comfort. The lip may be rubberized or covered with a flexible material such as foam rubber (not shown) to further improve comfort or to block out additional ambient light. Within the eye cup, the second casing includes an oculus 238 or opening that provides access to the patient's eye. The eye cup 234 is rotatably mounted so that it may be oriented to cover the right or left eye. As shown, the eye cup is oriented for placement over a patient's left eye. The eye cup would be rotated 180 degrees for placement over the patient's right eye. In the depicted embodiment, the second casing 230 rotates on the first casing 214 to allow the eye cup to rotate relative to the camera, but other arrangements are possible and contemplated.
Housed within the casings 214, 230, the illuminator includes a light ring 240 to provide illumination for device operation. The light ring includes a plurality of high intensity white lights arranged in a circular pattern around the camera, and may be, e.g., a plurality of white light emitting diodes (LED), or other suitable high intensity, low power light source. The light ring 240 receives power from the battery and is configured to provide sufficient illumination for operation of a projector cone 250. Brightness of the light ring can be manually adjusted by use of a control knob (not shown), for example, to provide additional illumination of the cornea, or to reduce brightness for improved patient comfort.
The projector cone 250 comprises a transparent or translucent plastic film arranged in a truncated cone shape and is printed with a ring pattern. The cone has a narrow end that encircles the camera lens system 260, and a wide end that fits around the circumference of the oculus 238. The projector cone 250 is arranged so that light from the light ring 240 shines through the sides of the cone and projects the ring pattern onto the cornea. The projector cone 250 is removable and interchangeable with other projector cones that may be printed with different grid or ring patterns, each of which is suitable for a different application. The lens system 260 is situated between the camera 220 and the patient's eye and focuses light entering the camera to provide suitably clear images of the patient's eye for DES diagnosis and treatment. Like the light ring, the lens system is powered by the battery.
With reference to
The illuminator 300 is configured for use with a software application hosted on a smartphone or other mobile device. The user (the user and patient may be the same individual) first connects the illuminator to the smartphone via, e.g., a USB connector or via Bluetooth, and controls and operates the illuminator through the application. From the application, the user activates the illuminator camera 320 to record video of a patient's eye. The user places the illuminator eye cup over the patient's eye socket, and the application accesses imagery from the camera. The application then uses a machine learning model to detect when the eye is in focus and correctly aligned within the projector cone 350 ring pattern. The application also uses video analytics to determine whether the left eye or the right eye is being imaged. Once focus and alignment conditions are met, the application automatically begins recording video and instructs the patient to perform a series of blinks. The application then evaluates whether a video of the eye of acceptable quality has been captured, including checking the blink sequence, the length of video, the eye alignment, and focus. When a suitable video is recorded, the application informs the user, and performs analysis on the recorded video. Then the application reports the results to the user.
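By way of non-limiting illustration, the quality evaluation described above could be sketched as follows. The per-frame inputs, the minimum recording length and blink count, and the usable-frame requirement are illustrative assumptions, not values from this disclosure.

```python
# Hypothetical sketch of the post-recording quality gate described above.
# Threshold values and the per-frame inputs are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class FrameInfo:
    eye_closed: bool   # from blink detection
    in_focus: bool     # from a focus metric or the ML model
    aligned: bool      # eye centered within the projected ring pattern

def video_acceptable(frames: list[FrameInfo], fps: float,
                     min_seconds: float = 10.0, min_blinks: int = 3) -> bool:
    """Accept a recording only if it is long enough, contains the
    instructed blink sequence, and stays focused and aligned."""
    if len(frames) / fps < min_seconds:
        return False
    # Count closed-to-open transitions as completed blinks.
    blinks = sum(1 for a, b in zip(frames, frames[1:])
                 if a.eye_closed and not b.eye_closed)
    if blinks < min_blinks:
        return False
    usable = sum(1 for f in frames if f.in_focus and f.aligned)
    return usable / len(frames) >= 0.9   # assumed 90% usable-frame requirement
```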
With reference to
With reference to
Also like the previous embodiment, the mounted embodiment is operated through the software application run on the smartphone. The user/patient first mounts the illuminator onto the smartphone by sliding the smartphone into the mounting slot until the phone is seated in place. The illuminator is then connected to the smartphone via USB connector or other suitable means, e.g., near field communication system, Bluetooth, etc., and the user controls and operates the illuminator through the application. The user places the illuminator eye cup over the patient's eye socket, and the application accesses imagery from the smartphone camera. From the application, the user activates the smartphone camera to record video of a patient's eye. The remainder of the application functionality is like the previous embodiment.
Some embodiments of the mounted illuminator are configured with an alternate system for activating the smartphone camera. In such embodiments, the illuminator is further equipped with an electromagnet (not shown) powered by the power source and electrically connected to the light ring. The application is further configured to monitor the smartphone compass for variations. When the application is used to turn on the light ring, the electromagnet is also turned on, creating a small change in the magnetic flux captured by the compass. The application detects this change via the compass and activates the camera to record video of the cornea. The electromagnet system adds a backup actuation means to ensure the smartphone camera activates on time in the event the app's camera focus and alignment determinations do not consistently initiate video recording.
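A minimal sketch of this compass-monitoring logic follows. Reading the magnetometer is platform-specific and is abstracted here as a stream of field magnitudes in microtesla; the deviation threshold is an assumed value.

```python
# Sketch of the compass-based backup trigger. The magnetometer stream and
# the 5 uT detection threshold are illustrative assumptions.

def detect_ring_activation(readings, baseline_uT: float,
                           threshold_uT: float = 5.0):
    """Return the index of the first magnetometer sample whose deviation
    from baseline suggests the electromagnet (and light ring) is on,
    or None if no activation is observed."""
    for i, value in enumerate(readings):
        if abs(value - baseline_uT) > threshold_uT:
            return i    # trigger video recording at this sample
    return None

# Example: baseline field ~48 uT, electromagnet shifts it past the threshold.
assert detect_ring_activation([48.1, 48.0, 55.2], baseline_uT=48.0) == 2
```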
The illuminator embodiments described herein improve existing DES systems in several ways, three of which are identified here. The first improvement is reduced ring pattern complexity. Placido rings or keratoscopes used to identify eye irregularities project a detailed, fine ring pattern onto the eye to determine the curvature of the cornea, which has much higher precision requirements than DES treatment. As disclosed, the illuminator uses a much-simplified projected Placido ring pattern designed for use with the discrete light sources of the light ring. The light sources are arranged along the ring so that when they shine through the projector cone and reflect off the cornea, they provide the necessary illumination and reference points for the application to determine various metrics relevant to DES. The simplified ring pattern need only provide consistent features with which to compare the distortion of light as it reflects off the fluid coating the eye.
Another advantage of the disclosed DES treatment system is the shape of the eye cup, which automatically aligns the camera with the patient's cornea and simplifies camera focusing. When the user/patient fits the eye cup over the eye, the camera self-aligns with, and is located a fixed distance from, the cornea. The eye cup thus allows self-testing by eliminating the need for a trained user, or reliance on camera autofocus. The eye cup design also improves the quality of images and thus improves result consistency by limiting interference from ambient light.
In addition, embodiments of the disclosed DES device perform automatic eye placement recognition. The rotatable eye cup allows use of the illuminator on either the left or the right eye without changing illuminator orientation, and application software uses video analytics to recognize which eye it is facing. The illuminator system can therefore calibrate its analysis for the appropriate eye without reliance on gyroscope information.
In another embodiment, the disclosed invention includes devices, systems, and methods for dry eye condition measurement and analysis in the form of a portable DES diagnosis and monitoring device that allows a clinician to rapidly record and measure a patient's dry eye condition. Whereas the prior embodiments present a single-camera system, with or without a Placido ring projection pattern, that scans the left and right eyes separately and records multiple individual videos, this embodiment of the present invention provides a binocular camera system. Disclosed hereafter are embodiments of a binocular illuminator device for clinical use, including a camera for each of a patient's right eye and left eye so that both eyes can be imaged simultaneously. The binocular illuminator devices are configured to perform anterior eye video recording and can overlay such recordings with projected grid patterns. The video captures both eyes simultaneously, and the application thereafter separates the left and right eyes for analysis. Also included is the capability to adjust the interpupillary distance (IPD) between the cameras to allow the cameras to be positioned correctly in front of a patient's eyes. Such embodiments further include a capability to change the Placido ring projector pattern rapidly.
With reference to
The cameras are configured to capture videos of the cornea and are housed in a casing to house the binocular illuminator components. The imaging components 610 are moveable relative to each other to adjust the IPD between the focal point of each camera. The distance between camera focal points can be adjusted by a manual thumbwheel 650 to ensure a patient's eyes are left/right aligned with the cameras for effective imaging. The imaging components are also moveable up or down relative to the casing to adjust the up/down alignment of the focal point of each camera with the eyes. An up and down adjustment wheel may also be included in the casing. The imaging components are mounted on suitable mechanisms 660 to allow left-right and up-down adjustments, such as sets of gears and rails that are configured to mechanically interact. The casing includes side walls, a top and bottom, and two eye cups located in the front of the illuminator that are configured to ergonomically interact with the left and right eye sockets of a patient. The eye cups may be shaped to conform to the patient's face to block out ambient light and may include a rounded lip to improve user comfort. The lip may be rubberized or covered with a flexible material such as foam rubber (not shown) to further improve comfort and/or to block out additional ambient light. Within the eye cups, the casing includes an oculus or opening that provides access to the patient's eye. Some embodiments may include a skirt or lip (not shown) surrounding both eye cups to block out ambient light.
Housed within the casing, the illuminator includes two light rings to provide illumination for device operation. Each light ring includes a plurality of high intensity white lights arranged in a circular pattern around its camera, and may be, e.g., a plurality of white light emitting diodes (LEDs), or another suitable high intensity, low power light source. Each light ring receives power from the illuminator and is configured to provide sufficient illumination for operation of a projector cone. Brightness of the light rings can be manually adjusted by use of a control knob (not shown), for example, to provide additional illumination of the cornea, or to reduce brightness for improved patient comfort. A switch for powering the light rings on or off is also included.
The projector cone for each imaging component comprises a transparent or translucent plastic film arranged in a truncated cone shape and is printed with a ring pattern. The cone has a narrow end that encircles the camera lens system, and a wide end that fits around the circumference of the oculus. The projector cone is arranged so that light from the light ring shines through the sides of the cone and projects the ring pattern onto the cornea. The projector cone is removable and interchangeable with other projector cones that may be printed with different grid or ring patterns, each of which is suitable for a different application. The binocular illuminator includes a magnetic attachment or screw on mechanism to allow the projector cones to be changed rapidly. The lens system is situated between the camera and the patient's eye and focuses light entering the camera to provide suitably clear images of the patient's eye for DES diagnosis and treatment. Like the light ring, the lens system is powered by the battery or smartphone.
With reference to
The binocular illuminator is configured for use with a software application hosted on a smartphone or other mobile device. The user (the user and patient may be the same individual) first connects the illuminator to the smartphone via, e.g., a USB connector, Bluetooth, or the like and controls and operates the illuminator through the application. From the application, the user activates the illuminator cameras to record video of a patient's eyes. The user places the illuminator eye cups over the patient's eye sockets, and the application accesses imagery from the cameras. The application then uses a machine learning model to detect when the eyes are in focus and correctly aligned within the projector cone ring pattern of each imaging component.
With reference to
With reference to
With reference to
With reference to
When the distance of either eye exceeds the maximum vertical tolerance 1360, a similar illumination of LEDs occurs. When the IPD is too low, the upper colored LEDs on the left and right illuminators will illuminate 1365, directing the user to adjust the IPD higher. When the IPD is too high, the lower colored LEDs on the left and right illuminators will illuminate 1370, directing the user to adjust the IPD lower. When the IPD is correct, the LEDs will not illuminate. Thus, the patient strives to turn off the LEDs to arrive at the correct alignment, ending the process 1390.
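The LED guidance rule described above may be sketched as follows; the tolerance value and the LED naming are illustrative assumptions.

```python
# Sketch of the LED feedback rule: LEDs light to indicate which direction
# to adjust, and all LEDs off means alignment is correct. The 2 mm
# tolerance and the two-LED interface are illustrative assumptions.

def guidance_leds(ipd_error_mm: float, tol_mm: float = 2.0) -> dict:
    """Return which indicator LEDs to light for a given IPD error.
    Negative error = IPD set too low, positive = IPD set too high."""
    leds = {"upper": False, "lower": False}
    if ipd_error_mm < -tol_mm:
        leds["upper"] = True   # prompt the user to adjust the IPD higher
    elif ipd_error_mm > tol_mm:
        leds["lower"] = True   # prompt the user to adjust the IPD lower
    return leds                # both off: alignment correct, process ends

assert guidance_leds(0.5) == {"upper": False, "lower": False}
```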
Once focus, IPD, and up/down alignment conditions are met, the application automatically begins recording video of both eyes and instructs the patient to perform a series of blinks. Video of each eye is stitched together into a composite image of both eyes together. The application then evaluates whether a video of acceptable quality has been captured, including checking the blink sequence, the length of video, the eye alignment, and focus. When a suitable video is recorded, the application informs the user, and performs analysis on the recorded video. The application then reports the results to the user.
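By way of illustration, the stitching step could be as simple as placing the per-eye frames side by side, assuming both cameras produce frames of equal height:

```python
# Minimal sketch of stitching per-eye frames into a single composite frame,
# assuming equal frame heights from the two cameras.

import numpy as np

def composite_frame(left_eye: np.ndarray, right_eye: np.ndarray) -> np.ndarray:
    """Place the two per-eye images beside each other so both eyes can be
    reviewed and analyzed together in one video frame."""
    if left_eye.shape[0] != right_eye.shape[0]:
        raise ValueError("per-eye frames must share the same height")
    return np.hstack([left_eye, right_eye])
```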
The mobile software application is the primary system for providing operational, analytic, and general functionalities for DES diagnosis and treatment through use of the illuminator device. As described above, the application controls illuminator operational functions, including left or right eye detection, camera focus and alignment, light ring activation, and anterior eye video recording. Certain of these functions, including left or right eye detection, camera focus and alignment, and IPD determination, are performed using machine learning models. Video recording also includes issuing instructions to the patient for when to blink, as well as performing an initial video quality assessment.
The application also performs digital image processing on a recorded video to allow DES analysis and parameter measurement. The application uses image processing techniques in conjunction with deep learning algorithms to perform object detection and segmentation of the anterior eye images. Using such techniques, the application can recognize the locations of the iris and lower tear meniscus and recognize when an eyelid is closed or open during a blink. Once features are identified, the application performs segmentation of relevant features, e.g., the lower tear meniscus. The application also uses a ring pattern from the projector cone to divide the eye into separate regions and performs image segmentation on each region. Image processing is sophisticated enough to ignore certain image artifacts, such as eyelash shadows or other image imperfections. Note that a Placido ring projection is not required for tear meniscus height measurements, blink rate calculations, and tear interferometric pattern analysis. A corneal surface analysis is conducted without a Placido Ring projection to measure a tear meniscus height and/or an optimal tear meniscus height.
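As a non-limiting sketch of the region division described above, the eye image can be partitioned into annular regions between consecutive projected rings. The ring radii and the iris center used here are assumed inputs; in the application they would come from the trained detection models.

```python
# Sketch of dividing the anterior eye image into annular regions keyed to
# the projected ring pattern, so each region can be segmented separately.
# Center coordinates and ring radii are illustrative assumptions.

import numpy as np

def ring_region_masks(shape, center, ring_radii_px):
    """Build one boolean mask per annular region between consecutive rings.
    `center` is (x, y) in pixels; `ring_radii_px` is sorted ascending."""
    h, w = shape
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot(xx - center[0], yy - center[1])
    bounds = [0, *ring_radii_px]
    return [(r >= lo) & (r < hi) for lo, hi in zip(bounds, bounds[1:])]

# Example: three rings around an assumed iris center at (320, 240).
masks = ring_region_masks((480, 640), (320, 240), [60, 120, 180])
```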
Once it completes object detection and image segmentation on the recorded video, the application can analyze the processed images to characterize parameters relevant to DES. For example, the application uses the ring pattern segmentation to perform a grid projection deformation analysis. The deformation analysis allows the application to measure the tear film breakup time. Recognition of blink events and timing their frequency allows the application to characterize a blink rate. Analysis results can be refined or corrected based on user observations or user review of intermediate data. The application may also compare corneal lipid imagery with example interferometric patterns to perform corneal lipid layer interferometric pattern recognition. Corneal lipid layer analysis may also be accomplished manually if insufficient data is collected, requiring the user to manually compare the corneal lipid images to known interferometric patterns.
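By way of illustration, blink rate and tear film break-up time can be derived from per-frame analysis outputs as sketched below. The per-frame eyelid states and ring-distortion flags are assumed to be produced by the segmentation stage described above.

```python
# Sketch of deriving blink rate and tear film break-up time (TBUT) from
# per-frame analysis results. The boolean inputs are assumed outputs of
# the object detection and segmentation stage.

def blink_rate_per_minute(eye_closed: list[bool], fps: float) -> float:
    """Blinks per minute from a per-frame closed/open sequence, counting
    each closed-to-open transition as one completed blink."""
    blinks = sum(1 for a, b in zip(eye_closed, eye_closed[1:]) if a and not b)
    return blinks * 60.0 * fps / len(eye_closed)

def tear_breakup_time(distortion_detected: list[bool], fps: float):
    """Seconds from the last complete blink (frame 0) until ring-pattern
    deformation first indicates tear film break-up, or None if stable."""
    for i, broken in enumerate(distortion_detected):
        if broken:
            return i / fps
    return None
```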
Besides the operational, image processing, and analysis functions, the application also provides some general functionalities. These include administrative functions, such as user registration and login, as well as results reporting and summarization. A user/patient can access a report on the current eye scan results or generate a historical trend report over a selected period. The application can also provide dry eye self-treatment instructions and suggestions based on a patient's current or historical results. The application can also send smartphone notifications to the patient, such as prompting a regular eye scan, or prompting eye drop administration or other self-treatments. Finally, the application has connectivity functionalities to provide a patient's eye scan data to a cloud server for storage, intermediate processing or analysis, or advanced data processing or analysis.
With reference to
Once videos of sufficient quality are acquired, the application performs digital image processing and analysis on the videos. The application will then instruct the user to perform manual adjustments to intermediate or final results if change criteria require such adjustments. Once the analysis is complete, the application provides the user a summary page displaying the current eye scan results 1414. If longitudinal data is available, i.e., the user has performed multiple eye scans over a period of time, the application will prompt the user to select a period for the display of trend data, and the application will produce a trend report 1415 for the user to compare eye health over the course of the reported period. Finally, the application will provide the user with tailored self-treatment suggestions, eye care tips, and useful external links related to DES based on the user's reported current and historical results 1416.
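A minimal sketch of such a trend report follows; the scan record fields and the choice of summary statistics are illustrative assumptions rather than the disclosed report format.

```python
# Sketch of the longitudinal trend report: summarize stored scans in a
# user-selected period and report change relative to the earliest scan.
# Field names and the storage format are illustrative assumptions.

from datetime import date
from statistics import mean

def trend_report(scans: list[dict], start: date, end: date) -> dict:
    """Average key DES parameters over scans in [start, end] and report
    the change from the earliest to the latest scan in the period."""
    period = sorted((s for s in scans if start <= s["date"] <= end),
                    key=lambda s: s["date"])
    if not period:
        return {}
    return {
        "scans": len(period),
        "mean_tbut_s": mean(s["tbut_s"] for s in period),
        "mean_meniscus_mm": mean(s["meniscus_mm"] for s in period),
        "tbut_change_s": period[-1]["tbut_s"] - period[0]["tbut_s"],
    }
```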
With reference to
With reference to
The method then uses the detected lower tear meniscus regions to perform analysis. First the method isolates the detected tear meniscus regions from each frame by cropping out a rectangular image of the tear meniscus region 1614. Then the method locates the center of each cropped rectangle 1615, and crops out a smaller section, e.g., 200×100 pixels centered on the rectangle center 1616. The method then uses another trained learning model to perform digital image processing on the smaller sections, which yields a segmentation and mask of the tear meniscus 1617. The method next uses the tear meniscus mask to measure a tear meniscus height and determine a tear meniscus location 1618. The method performs steps 1614 through 1618 on three consecutive frames and uses the results to determine an optimal tear meniscus height 1619.
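By way of non-limiting illustration, steps 1614 through 1619 could be sketched as follows. The segmentation model itself is assumed, so the mask is taken as an input, and the use of per-column and cross-frame medians is an assumed design choice rather than the disclosed computation.

```python
# Sketch of steps 1614-1619: crop a 200x100 window centered on the detected
# meniscus region and measure meniscus height from a segmentation mask.

import numpy as np

def center_crop(img: np.ndarray, cx: int, cy: int, w=200, h=100) -> np.ndarray:
    """Crop a w x h window centered on (cx, cy), clamped to the image."""
    x0 = max(0, min(cx - w // 2, img.shape[1] - w))
    y0 = max(0, min(cy - h // 2, img.shape[0] - h))
    return img[y0:y0 + h, x0:x0 + w]

def meniscus_height_px(mask: np.ndarray) -> float:
    """Median per-column height (in pixels) of the meniscus mask; the
    median resists stray mask pixels from shadows or reflections."""
    heights = mask.astype(bool).sum(axis=0)
    cols = heights[heights > 0]
    return float(np.median(cols)) if cols.size else 0.0

def optimal_height_px(masks: list[np.ndarray]) -> float:
    """Combine measurements from three consecutive frames, as in step 1619;
    taking the median across frames is an assumed choice."""
    return float(np.median([meniscus_height_px(m) for m in masks]))
```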
With reference to
With reference to
Also disclosed are embodiments of a mobile software application, and accompanying methods to perform dry eye syndrome diagnosis, assessment, and treatment. Software application embodiments for the binocular illuminator allow automatic right eye and left eye identification and analysis. The application operates the illuminator, performs video analysis, computes dry eye-related parameters, summarizes the measured values, presents longitudinal data trends, and guides self-treatment. It also provides user administrative functions and facilitates data storage locally and on cloud servers.
With reference to
With reference to
With reference to
With reference to
With reference to
With reference to
Some portions of this specification are presented in terms of algorithms or symbolic representations of operations on data stored as bits or binary digital signals within a machine memory (e.g., a computer memory). These algorithms or symbolic representations are examples of techniques used by those of ordinary skill in the data processing arts to convey the substance of their work to others skilled in the art. As used herein, an “algorithm” is a self-consistent sequence of operations or similar processing leading to a desired result. In this context, algorithms and operations involve the manipulation of information elements. Typically, but not necessarily, such elements may take the form of electrical, magnetic, or optical signals capable of being stored, accessed, transferred, combined, compared, or otherwise manipulated by a machine. It is convenient at times, principally for reasons of common usage, to refer to such signals using words such as “data,” “content,” “bits,” “values,” “elements,” “symbols,” “characters,” “terms,” “numbers,” “numerals,” “words,” “materials,” etc. These specific words, however, are merely convenient labels and are to be associated with appropriate information elements.
Upon reading this disclosure, those of skill in the art will appreciate still additional alternative structural and functional designs for a system and a process for dry eye syndrome diagnosis and treatment through the disclosed principles herein. Thus, while embodiments and applications have been illustrated and described, it is to be understood that the disclosed embodiments are not limited to the precise construction and components disclosed herein. Various modifications, changes, and variations, which will be apparent to those skilled in the art, may be made in the arrangement, operation and details of the method and apparatus disclosed herein without departing from the spirit and scope of the invention.
It will also be understood by those familiar with the art, that the invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. Likewise, the naming and division of the modules, managers, functions, systems, engines, layers, features, attributes, methodologies, and other aspects are not mandatory or significant, and the mechanisms that implement the invention or its features may have different names, divisions, and/or formats. Furthermore, as will be apparent to one of ordinary skill in the relevant art, the modules, managers, functions, systems, engines, layers, features, attributes, methodologies, and other aspects of the invention can be implemented as software, hardware, firmware, or any combination of the three. Of course, wherever a component of the present invention is implemented as software, the component can be implemented as a script, as a standalone program, as part of a larger program, as a plurality of separate scripts and/or programs, as a statically or dynamically linked library, as a kernel loadable module, as a device driver, and/or in every and any other way known now or in the future to those of skill in the art of computer programming. Additionally, the present invention is in no way limited to implementation in any specific programming language, or for any specific operating system or environment.
While this invention has been described in terms of several embodiments, there are alterations, modifications, permutations, and substitute equivalents, which fall within the scope of this invention. Although subsection titles have been provided to aid in the description of the invention, these titles are merely illustrative and are not intended to limit the scope of the present invention. In addition, where claim limitations have been identified, for example, by a numeral or letter, they are not intended to imply any specific sequence.
This has been a description of the disclosed invention along with a preferred method of practicing the invention.
The present application is a continuation-in-part of and claims priority to U.S. Non-Provisional patent application Ser. No. 18/298,827, filed 11 Apr. 2023, which claims priority to U.S. Provisional Application Ser. No. 63/372,842, filed 11 Apr. 2022, and U.S. Provisional Application Ser. No. 63/495,411, filed 11 Apr. 2023, both of which are hereby incorporated by reference in their entirety for all purposes as if fully set forth herein.
Number | Date | Country
63/495,411 | Apr 2023 | US

Relation | Number | Date | Country
Parent | 18/298,827 | Apr 2023 | US
Child | 18/631,190 | | US