Testing visual fields, i.e., assessing the sensitivity to contrast and other functions of the visual system, is a standard procedure in eye care, especially in glaucoma diagnosis and follow-up. Visual field testing is typically performed one eye at a time (since defects in one eye can be compensated for by the other eye) by displaying stimuli to a subject and detecting the subject's response using large instruments. For instance, one common type is a bowl-based system, in which stimuli are presented on the inside surface of a bowl while the subject fixates on a central target.
Another type of visual field testing system is a screen-based testing system (see, for example, U.S. Pat. Nos. 8,371,696; 5,912,723; and D472637, each of which is hereby incorporated by reference). U.S. Pat. No. 8,931,905 (hereby incorporated by reference) describes a method of testing the visual field of a subject on a two-dimensional display screen.
Some drawbacks and/or limitations associated with the above-discussed and existing visual field systems are as follows. Bowl-based systems are typically large and lack portability, so testing has to be performed on-site under the supervision of an instrument operator. An eye camera or a gaze tracker is often needed to verify whether a subject is looking at a fixation target during the examination. A response button (e.g., a mechanical clicker) is typically the primary means of a subject's feedback to a visual stimulus; other means of subject response, particularly motion-related responses, are not currently incorporated into such systems. Further, a system has to be physically moved or mechanically adjusted to display content specific to the left or the right eye, because the inter-eye distance (i.e., the horizontal and/or vertical distance between the two eyes) varies among individuals and is currently not taken into account during visual field testing; the horizontal inter-eye distance in adults typically varies in a range of 55 to 75 mm. Finally, the non-examined eye (i.e., the eye not undergoing the test) is typically covered with an eye patch during the examination (see, for example, U.S. Pat. No. 5,094,524). The eye patch prevents the non-examined eye from receiving the same amount of ambient light as the examined eye, so it goes through its own dark and light adaptation cycles, which influence the examined eye during testing and degrade test accuracy.
Here we describe an improved visual testing system and a method for testing a visual field condition of a subject that overcome the limitations discussed above.
According to one aspect of the subject matter described in the present application, a method of performing a visual field test of a subject includes determining inter-eye distance of the subject; displaying visual stimuli on a left display region and a right display region of a two-dimensional display based on the determined inter-eye distance, said left display region configured to display content specific to the left eye and said right display region configured to display content specific to the right eye of the subject; tracking subject responses to the visual stimuli; evaluating the visual field of each eye of the subject based on the subject responses; and reporting or storing results of the evaluation describing the subject's visual field condition or a further analysis thereof.
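By way of a non-limiting illustration, the overall flow of this method can be sketched in code. The sketch below is illustrative only: the helper objects (`display`, `sensor`) and their methods (`measure_inter_eye_distance`, `position_eye_regions`, `present_stimulus`, `wait_for_response`) are hypothetical stand-ins for the hardware-specific operations described herein, not an actual API of the disclosed system.

```python
from dataclasses import dataclass

@dataclass
class TrialResult:
    eye: str            # "left" or "right"
    location: tuple     # (x, y) position relative to the fixation target
    intensity: float    # stimulus intensity on the display scale
    seen: bool          # whether a subject response was captured

def run_visual_field_test(display, sensor, stimulus_plan):
    """Minimal test loop: position the per-eye display regions from the
    measured inter-eye distance, present each planned stimulus, and
    record whether a response was captured."""
    ipd_mm = sensor.measure_inter_eye_distance()      # hypothetical call
    display.position_eye_regions(ipd_mm)              # left/right regions

    results = []
    for stim in stimulus_plan:                        # eye, location, intensity
        display.present_stimulus(stim)
        seen = sensor.wait_for_response(timeout_s=1.5)  # nod, click, etc.
        results.append(TrialResult(stim.eye, stim.location,
                                   stim.intensity, seen))
    return results   # passed on to the per-eye evaluation step
```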
Further aspects include various additional features and operations associated with the above and following aspects and may further include, but are not limited to, corresponding systems, methods, apparatuses, and computer program products.
The invention discussed herein is advantageous in a number of respects. For instance, the invention 1) takes into account differing inter-eye distances (horizontal and/or vertical) among individuals for visual field testing without the need for mechanical adjustment, 2) allows 100% control of fixation without a camera, 3) eliminates the need for an eye patch, 4) allows the use of inexpensive displays for visual testing by enhancing the usable perceived luminance scale resolution, 5) tracks physical motion (e.g., a slight nod or head movement) as a subject response to a visual stimulus, and 6) presents the test results in a new and intuitive way to the examined person.
The features and advantages described herein are not all-inclusive and many additional features and advantages will be apparent to one of ordinary skill in the art in view of the figures and description. Moreover, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes and not to limit the scope of the inventive subject matter.
System Overview
In another embodiment, one or more components of the system 300 may automatically determine the inter-pupillary distance (IPD) (also referred to generally as the inter-eye distance) without involving the subject's response, by displaying to each eye of the subject a fine grid whose lines are clearly distinguishable.
It should be noted that while the subject 314 appears to be front facing in the accompanying figure, this orientation is shown for illustration purposes only.
The horizontal and/or vertical inter-eye distance measurements are processed by a processor 316 to generate visual stimuli and fixation targets on a display screen, such as the display screen 320. The display screen 320, as depicted, includes a left display region 322 for showing content 326 (e.g., a fixation target and/or visual stimuli) specific to the left eye 306 of the subject 314 and a right display region 324 for showing content 328 specific to the right eye 308 of the subject 314. A fixation target, and visual stimuli positioned with respect to that fixation target, are shown for each eye in the corresponding display region according to the inter-eye distance 304 and/or the vertical positions 310 and 312 of the subject's eyes. If a subject's inter-eye distance (horizontal and/or vertical) is not taken into account, or if the deviation from a standard average setting is too large, the subject may need to concentrate harder to fuse the images displayed in the left and right display regions, which may lead to headaches and/or nausea.
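To make the geometry concrete, the following is a minimal sketch of how per-eye fixation-target positions could be computed in software from a measured inter-eye distance, eliminating the mechanical adjustment discussed above. The function name, the pixels-per-millimeter conversion, and the assumption that the subject is centered on the screen are illustrative choices, not requirements of the disclosure.

```python
def fixation_centers(screen_w_px, screen_h_px, px_per_mm,
                     ipd_mm, left_dy_mm=0.0, right_dy_mm=0.0):
    """Compute pixel coordinates for the left- and right-eye fixation
    targets so that the two display regions line up with the subject's
    eyes. Vertical offsets accommodate eyes at different heights."""
    cx, cy = screen_w_px / 2.0, screen_h_px / 2.0
    half_ipd_px = (ipd_mm / 2.0) * px_per_mm
    left = (cx - half_ipd_px, cy + left_dy_mm * px_per_mm)
    right = (cx + half_ipd_px, cy + right_dy_mm * px_per_mm)
    return left, right

# Example: a 1920x1080 screen at ~5 px/mm and a subject IPD of 63 mm
print(fixation_centers(1920, 1080, 5.0, 63.0))
# ((802.5, 540.0), (1117.5, 540.0))
```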
The processor 316 can project the generated fixation targets and/or visual stimuli onto the display screen 320 via an optional projector 318 (e.g., a DLP projector) or can present them directly on the display screen 320. For instance, a smartphone includes both the processor 316 and the display screen 320 (see, for example, the smartphone 800 discussed below).
Although a single display screen 320 is shown comprising the two separate regions (i.e., the left display region 322 and the right display region 324), it should be noted that two separate display screens can be used to show content (e.g., visual stimuli) specific to each eye of the subject 314. In some embodiments, the display screen 320 can be comprised within a virtual reality headset that enables the subject 314 to experience content in a virtual reality environment, as discussed in further detail below.
Once the visual stimuli 326 and/or 328 are displayed to the subject 314, a response-capturing device 330 is used to track subject responses to the stimuli, i.e., whether the subject observed the stimuli or not. In some embodiments, the response-capturing device 330 is a motion sensor or gyro-sensor that tracks subject body movements (e.g., head movements or head nods) to determine whether a stimulus was seen. In other embodiments, the response-capturing device 330 is a traditional clicker or push button (e.g., the button 107 shown in the figures) that the subject actuates in response to a stimulus.
Subject responses captured by the response-capturing device 330 are sent to the processor 316 for evaluating the subject's visual field and/or defect(s) associated with each eye of the subject 314. In some embodiments, the processor 316 may compare subject responses with baseline, normative, or reference data, including prior response data of the subject or response data of other subjects, to assess a trend, change, and/or progression in the subject's visual field condition. In some embodiments, the processor 316 may demonstrate one or more defects associated with each eye of the subject using an image, as discussed in further detail below.
In some embodiments, the eye sensor 302, the processor 316, the display screen 320, and the response-capturing device 330 discussed above are parts of a single computing device, which may be used for testing the visual field of a subject in a compact, portable form factor. For instance, the computing device may be a smartphone, such as the smartphone 800 discussed below.
Example Method
In block 406, the method 400 tracks the subject's responses to the displayed visual stimuli. Subject responses may include, for example, a head nod or movement and/or any other body motion, which can be tracked using a motion sensor, such as the response-capturing device 330. It should be noted that other types of subject responses, such as pressing a response button (e.g., a mechanical clicker), verbal responses, eye movements, etc., are also possible and are within the scope of the present invention.
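As one hypothetical illustration of motion-based response capture, a head nod could be detected by thresholding the pitch-axis angular velocity reported by a gyro-sensor. The threshold values and sample format below are illustrative assumptions, not calibrated parameters of the disclosed system.

```python
def detect_nod(gyro_samples, pitch_threshold_dps=40.0, min_samples=3):
    """Return True if the angular-velocity stream contains a nod-like
    movement: at least `min_samples` consecutive samples whose pitch
    rate exceeds the threshold. Samples are (roll, pitch, yaw) rates
    in degrees per second."""
    run = 0
    for _, pitch_rate, _ in gyro_samples:
        if abs(pitch_rate) >= pitch_threshold_dps:
            run += 1
            if run >= min_samples:
                return True
        else:
            run = 0
    return False

# A brief downward/upward head movement registers as a "seen" response:
samples = [(0, 2, 1), (0, 55, 0), (0, 60, 1), (1, 48, 0), (0, 3, 0)]
print(detect_nod(samples))  # True
```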
Based on the subject responses to the visual stimuli, the method in block 408 evaluates the visual field of the subject and then, in block 410, reports or stores results of the evaluation describing the subject's visual field condition in a memory (e.g., the memory 1104) or a data store (e.g., the baseline data 1008) for future access, retrieval, and/or analysis. In some embodiments, the results of the evaluation can be displayed to the subject on the same display screen on which the subject was tested, or the results can be sent to a clinician/administrator for display on his/her clinician device 1010, as discussed below with respect to remote evaluation.
It should be understood that the method described herein is not limited to the steps and/or operations discussed above and that other steps and/or operations are also possible and within the scope of the present disclosure. It should also be understood that not every step described herein must be performed. Further, these steps can be performed autonomously and/or independently, without involving a clinician/administrator to administer the visual field test.
The following passages now discuss some additional features of the present invention:
Verification of Subject Attentiveness without a Camera
Verifying that an eye is looking at a predefined fixation target during testing traditionally requires a camera to supervise the eye during the examination. In one embodiment of the present invention, this verification is done without an eye camera (e.g., the eye sensor 302) or a gaze tracker. This may be achieved by first presenting visual stimuli to the subject early in the visual field test and recording a baseline distribution of the subject's reaction times to those stimuli. Once the baseline distribution is established, test stimuli are presented in the blind spot a split second (e.g., a certain number of milliseconds, such as 200 ms) before the regular visual stimuli. If the subject then responds earlier than the computed range of reaction times allows, it can be concluded that the response was given to the blind-spot test stimulus rather than the regular visual stimulus, and hence that the subject was not looking at the predefined fixation target (malfixation).
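The following sketch illustrates this catch-trial logic under simple assumptions: the plausible lower bound of reaction times is modeled as the baseline mean minus a multiple of the standard deviation, which is one possible model rather than the only one.

```python
import statistics

def reaction_time_cutoff(baseline_ms, k=3.0):
    """Lower bound of plausible reaction times from the baseline trials
    recorded early in the test (mean - k * standard deviation)."""
    return statistics.mean(baseline_ms) - k * statistics.stdev(baseline_ms)

def indicates_malfixation(response_ms, cutoff_ms):
    """A blind-spot catch stimulus was shown ~200 ms before the regular
    stimulus. If the response (timed from the regular stimulus) arrives
    faster than the subject's plausible minimum, it must have been
    triggered by the catch stimulus -- i.e., the eye was not on the
    fixation target."""
    return response_ms < cutoff_ms

baseline = [310, 345, 298, 330, 362, 315, 340]   # ms, from early trials
cutoff = reaction_time_cutoff(baseline)          # ~262 ms here
print(indicates_malfixation(120, cutoff))        # True  -> malfixation
print(indicates_malfixation(335, cutoff))        # False -> fixating
```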
Eye-Blink Detection without a Camera
Eye blinks can be detected with a simple photosensor pointed at at least one of the eyes; in one embodiment, this photosensor could be the eye sensor 302. In some instances, the photosensor may be a separate component included in the system 300 for detecting eye blinks as discussed herein. Light refraction caused by a closed eyelid differs from the refraction caused when the eyelid is open, and this difference can be detected by the photosensor to register eye blinks. Detecting eye blinks with such a photosensor is advantageous for low-cost visual field testing systems.
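A minimal sketch of such blink detection follows, assuming the photosensor yields a sampled light-level trace in which a closed eyelid raises the signal above a threshold; the sign and magnitude of the change depend on the actual sensor geometry, so the threshold here is purely illustrative.

```python
def detect_blinks(trace, threshold=0.5, min_len=2):
    """Return (start, end) sample index pairs where the photosensor
    signal stays above `threshold` for at least `min_len` samples,
    each run corresponding to one eyelid closure."""
    blinks, start = [], None
    for i, v in enumerate(trace):
        if v > threshold and start is None:
            start = i
        elif v <= threshold and start is not None:
            if i - start >= min_len:
                blinks.append((start, i))
            start = None
    if start is not None and len(trace) - start >= min_len:
        blinks.append((start, len(trace)))
    return blinks

trace = [0.20, 0.21, 0.80, 0.85, 0.82, 0.22, 0.20, 0.79, 0.81, 0.20]
print(detect_blinks(trace))  # [(2, 5), (7, 9)] -> two blinks detected
```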
Reducing Troxler and/or Blackout Effects
In some embodiments, the background of each display region of the display screen 320 can be enriched with a pattern or other non-uniform content (see, for example, the non-uniform backgrounds 502 and 504 in the left display region 322 and the right display region 324, respectively). A non-uniform background counteracts the perceptual fading of stationary stimuli during steady fixation (the Troxler effect) and reduces blackout effects.
Elimination of Eye Patch
In one embodiment of the present invention, fixation targets and/or background images are shown to both eyes of a subject, but visual stimuli are displayed to only one eye. This results in improved fixation behavior for eyes with advanced and central visual field defects. It further prevents the non-examined eye from going through an independent light or dark adaptation cycle, as happens when that eye is covered with an eye patch in traditional visual field testing.
Image Based Visual Field Defect Demonstration
In some embodiments, the visual field condition of the subject 314 can be demonstrated using a photograph, an image, and/or a picture. Such an image may show which parts of the image can and cannot be perceived by each eye of the subject given his/her respective visual field.
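One way such a demonstration could be produced in software is sketched below: a coarse per-eye visual field map is used to gray out the parts of a photograph that fall within a defect. The grid layout and the mid-gray fill are illustrative assumptions.

```python
import numpy as np

def demonstrate_defect(image, field_map, fill=128):
    """Gray out regions of a grayscale image (H, W) that fall within a
    visual field defect. `field_map` is a coarse (h, w) boolean grid
    covering the same extent, with True meaning "perceived"."""
    H, W = image.shape
    h, w = field_map.shape
    out = image.copy()
    for i in range(h):
        for j in range(w):
            if not field_map[i, j]:
                out[i * H // h:(i + 1) * H // h,
                    j * W // w:(j + 1) * W // w] = fill
    return out

img = np.arange(64, dtype=np.uint8).reshape(8, 8)
vf = np.array([[True, False],        # defect in the upper-right quadrant
               [True, True]])
print(demonstrate_defect(img, vf))
```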
Increasing Dynamic Range of the Display
The human visual system typically has very high contrast sensitivity: often a single intensity step of 1/256 in an 8-bit greyscale system can be perceived, especially if the background intensity is chosen in the lower (darker) range of the display scale. True thresholding, however, requires detecting the threshold at which a contrast difference becomes visible/perceivable for the individual being tested. A traditional solution to this problem, used in standard perimetry for decades, is to change the visual stimulus size (e.g., make the stimulus smaller if the minimum intensity of a stimulus target can still be perceived on top of a given background intensity); this simply reduces the number of pixels of the target that the individual needs to detect. In some embodiments, one or more methods can be used to increase the dynamic range of a display, such as the display screen 320.
Such methods can enhance the standard number of greyscales in low-cost displays from 8-bit (256 grey levels) to at least 10-bit (1,024 grey levels), making them suitable for threshold testing.
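One well-known family of techniques for this kind of enhancement is spatiotemporal dithering; the sketch below shows simple temporal dithering and is offered as an assumption about how such an enhancement could work, not as a statement of the specific methods contemplated by the disclosure. A 10-bit target level is approximated on an 8-bit panel by alternating between the two nearest 8-bit levels across successive frames so that the time-averaged luminance matches the target.

```python
def temporal_dither(target_10bit, n_frames=4):
    """Render a 10-bit grey level (0..1023) on an 8-bit display by
    emitting `n_frames` 8-bit frames whose time average approximates
    the target. Each 10-bit step is 1/4 of an 8-bit step."""
    base = target_10bit // 4               # nearest lower 8-bit level
    frac = (target_10bit % 4) / 4.0        # leftover fraction of a step
    n_high = round(frac * n_frames)        # frames shown one level higher
    return [min(base + 1, 255)] * n_high + [base] * (n_frames - n_high)

frames = temporal_dither(513)              # 10-bit level between 128 and 129
print(frames)                              # [129, 128, 128, 128]
print(sum(frames) / len(frames) * 4)       # time average ~513.0
```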
Visual Field Testing in a Virtual Reality Environment
In some embodiments, the visual field test discussed herein can be performed in a virtual reality (VR) setting in which a display, such as the display screen 320, is embedded inside a virtual reality headset, such as the headset 900 described below.
To test the visual field of a headset wearer, the smartphone 800 within the virtual reality headset 900 generates a split-display (such as the split-display on the display screen 320 discussed above), presenting content specific to each eye of the wearer.
Remote Evaluation of Visual Field Condition
It should be understood that the present disclosure is not limited to this configuration, and that a variety of different system environments and configurations may be employed within the scope of the present disclosure. For instance, instead of the smartphone 800, any display device capable of generating a split-display (as discussed above with reference to the display screen 320) may be used for the visual field testing described herein.
As depicted, the smartphone 800 is connected to the network 1002 via signal line 1001 for communicating with the evaluation server 1004 and/or one or more optional clinician devices 1010a through 1010n. For remote evaluation, the smartphone 800 acts as a gateway that relays subject response data (e.g., eye or body movements, such as head or neck movements, captured as the subject reacts to a displayed visual stimulus) via the network 1002 (e.g., WiFi, a cellular data network, etc.) to the evaluation server 1004, which performs the evaluation. The evaluation server 1004 can be a hardware server that includes an evaluation engine 1006 for evaluating the condition of the subject's visual field based on the subject responses to the visual stimuli and then sending results of the evaluation via the network 1002 to the smartphone 800 or to the one or more clinician devices 1010 for display. In some embodiments, the evaluation engine 1006 may perform its evaluation by comparing the subject response data with baseline data 1008 stored in the evaluation server 1004. As discussed elsewhere herein, the baseline data 1008 may include prior response data of the subject currently under testing, which may be used to assess a trend, change, and/or progression in the subject's visual field condition. In some embodiments, the baseline data 1008 may also include response data of one or more other subjects for comparison purposes.
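As a hypothetical illustration of the gateway role described above, the smartphone could package the recorded trial data as JSON and POST it to the evaluation server. The endpoint path and payload fields below are assumptions for illustration; the disclosure does not fix a wire format.

```python
import json
import urllib.request

def relay_responses(subject_id, ipd_mm, trials, server_url):
    """Relay subject response data to the evaluation server and return
    the evaluation result (e.g., a per-eye sensitivity map)."""
    payload = json.dumps({
        "subject_id": subject_id,
        "ipd_mm": ipd_mm,
        "trials": trials,   # list of per-stimulus response records
    }).encode("utf-8")
    req = urllib.request.Request(
        server_url + "/evaluate",          # hypothetical endpoint
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.loads(resp.read())
```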
The clinician device(s) 1010 (any or all of 1010a through 1010n) are computing devices having data processing and data communication capabilities. The clinician devices 1010a through 1010n are communicatively coupled to the network 1002 via signal lines 1005a through 1005n, respectively, to receive results of the evaluation from the evaluation server 1004 or from the smartphone 800 and display them to their respective users. In some embodiments, a clinician device 1010 may be capable of evaluating a subject's visual field condition by itself. For instance, a clinician device 1010 may receive subject response data from the subject's smartphone 800, process and/or evaluate the response data using a processor (e.g., the processor 316) included in the clinician device 1010, and then present the results of its evaluation on its display screen to its respective user (e.g., a clinician, administrator, etc.). In some instances, the user of a clinician device 1010 may be a person skilled in the art of visual field testing.
In some embodiments, a clinician device 1010 is a smartphone (indicated by reference numeral 1010a), a laptop computer (indicated by reference numeral 1010n), a desktop computer, a netbook computer, a tablet, a smartwatch, etc. It should be understood that the one or more clinician devices are shown with dotted lines to indicate that they are optional and may not be part of the system 1000.
Example Computing Device
As depicted, the various components of the computing device 1100 are communicatively coupled by a communication bus 1108. The bus 1108 can include a conventional communication bus for transferring data between components of a computing device or between computing devices. It should be noted that some of these components have already been discussed above and the description for these will not be repeated here.
The processor 316 may execute various hardware and/or software logic, such as software instructions, by performing various input/output, logical, and/or mathematical operations. The processor 316 may have various computing architectures to process data signals including, for example, a complex instruction set computer (CISC) architecture, a reduced instruction set computer (RISC) architecture, and/or an architecture implementing a combination of instruction sets. The processor 316 may be physical and/or virtual, and may include a single core or a plurality of processing units and/or cores. In some embodiments, the processor 316 may be capable of generating and providing electronic display signals to a display device, such as the smartphone 800, supporting the display of images, capturing and transmitting images, performing complex tasks including various types of feature extraction and sampling, etc. As discussed elsewhere herein, the processor 316 may be configured to evaluate a subject's visual field based on subject responses and further configured to report or store results of the evaluation describing the subject's visual field condition in a data store. In some embodiments, the processor 316 may be coupled to the memory 1104 via a data/communication bus to access data and instructions therefrom and store data therein. The bus 1108 may couple the processor 316 to the other components of the computing device 1100.
The memory 1104 may store instructions and/or data that may be executed by the processor 316. In some embodiments, the memory 1104 stores at least the mobile application 1110, the evaluation engine 1006, and the baseline data 1008. In some embodiments, the memory 1104 may also be capable of storing other instructions and data including, for example, an operating system, hardware drivers, other software applications, databases, etc. The memory 1104 is coupled to the bus 1108 for communication with the processor 316 and other components of the computing device 1100. The memory 1104 may include a non-transitory computer-usable (e.g., readable, writeable, etc.) medium, which can be any apparatus or device that can contain, store, communicate, propagate, or transport instructions, data, computer programs, software, code, routines, etc. for processing by or in connection with the processor 316. A non-transitory computer-usable storage medium may include any and/or all computer-usable storage media. In some embodiments, the memory 1104 may include volatile memory, non-volatile memory, or both. For example, the memory 1104 may include a dynamic random access memory (DRAM) device, a static random access memory (SRAM) device, a flash memory device, a hard disk drive, a floppy disk drive, a CD ROM device, a DVD ROM device, a DVD RAM device, a DVD RW device, or any other mass storage device known for storing instructions on a more permanent basis.
In some embodiments, one or more of the smartphone 800, the evaluation server 1004, and the one or more clinician devices 1010 are located at the same or different locations. When at different locations, these components may be configured to communicate with one another through a wired and/or wireless network communication system, such as the communication unit 1106. The communication unit 1106 may include network interface devices (I/F) for wired and wireless connectivity. For example, the communication unit 1106 may include a CAT-type interface, a USB interface, or an SD interface, transceivers for sending and receiving signals using Wi-Fi™, Bluetooth®, or cellular communications for wireless communication, etc. The communication unit 1106 may be coupled to the network 1002 via the signal lines 1001, 1003, and 1005. The communication unit 1106 can link the processor 316 to a computer network, such as the network 1002, which may in turn be coupled to other processing systems.
The mobile application 1110 is storable in the memory 1104 and executable by the processor 316 of the smartphone 800 and/or a clinician device 1010 to provide for user interaction, receive user input, present information to the user, and send data to and receive data from the other entities of the system 1000 via the network 1002. In some embodiments, the mobile application 1110 may generate and present user interfaces (e.g., on the display screen 320) based at least in part on information received from the processor 316 and/or the evaluation server 1004. For example, a user/clinician may use the mobile application 1110 to receive results of an evaluation computed by the evaluation server 1004 on his/her clinician device 1010.
In the above description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the specification. It should be apparent, however, that the subject matter of the present application can be practiced without these specific details. It should be understood that the reference in the specification to “one embodiment”, “some embodiments”, or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the description. The appearances of the phrase “in one embodiment” or “in some embodiments” in various places in the specification are not necessarily all referring to the same embodiment(s).
Furthermore, the description can take the form of a computer program product accessible from a computer-usable or computer-readable medium providing program code for use by or in connection with a computer or any instruction execution system. For the purposes of this description, a computer-usable or computer readable medium can be any apparatus that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
The foregoing description of the embodiments of the present subject matter has been presented for the purposes of illustration and description. It is not intended to be exhaustive or to limit the present subject matter to the precise form disclosed. Many modifications and variations are possible in light of the above teaching. It is intended that the scope of the present subject matter be limited not by this detailed description, but rather by the claims of this application. As will be understood by those familiar with the art, the present subject matter may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. Furthermore, it should be understood that the modules, routines, features, attributes, methodologies, and other aspects of the present subject matter can be implemented using hardware, firmware, software, or any combination of the three.
This application is a National Phase application under 35 U.S.C. § 371 of International Application No. PCT/EP2017/059450, filed Apr. 20, 2017, which claims priority to U.S. Provisional Application Ser. No. 62/326,332, filed Apr. 22, 2016, the contents of which are hereby incorporated by reference.
Number | Name | Date | Kind |
---|---|---|---|
5026151 | Waltuck et al. | Jun 1991 | A |
5094524 | Fuhr | Mar 1992 | A |
5151722 | Massof et al. | Sep 1992 | A |
5550602 | Braeuning | Aug 1996 | A |
5565949 | Kasha, Jr. | Oct 1996 | A |
5737060 | Kasha, Jr. | Apr 1998 | A |
5864384 | McClure et al. | Jan 1999 | A |
5880812 | Solomon | Mar 1999 | A |
5898474 | McClure et al. | Apr 1999 | A |
5910834 | McClure et al. | Jun 1999 | A |
5912723 | Maddess | Jun 1999 | A |
5920374 | Vaphiades et al. | Jul 1999 | A |
5956124 | Dan | Sep 1999 | A |
6027217 | McClure et al. | Feb 2000 | A |
6033076 | Braeuning et al. | Mar 2000 | A |
6045227 | Stewart et al. | Apr 2000 | A |
6145991 | McClure et al. | Nov 2000 | A |
6149272 | Bergner et al. | Nov 2000 | A |
6290357 | Massengill et al. | Sep 2001 | B1 |
6386706 | McClure et al. | May 2002 | B1 |
6494578 | Plummer et al. | Dec 2002 | B1 |
D472637 | Cooper et al. | Apr 2003 | S |
6592222 | Massengill et al. | Jul 2003 | B2 |
7367671 | Sabel | May 2008 | B2 |
7446941 | Fukuda | Nov 2008 | B2 |
7448751 | Kiderman et al. | Nov 2008 | B2 |
7486341 | Hong et al. | Feb 2009 | B2 |
7682021 | Sabel | Mar 2010 | B2 |
7740592 | Graham et al. | Jun 2010 | B2 |
7753524 | Sabel | Jul 2010 | B2 |
7972278 | Graham et al. | Jul 2011 | B2 |
8075134 | Tanassi et al. | Dec 2011 | B2 |
8333475 | Sugio et al. | Dec 2012 | B2 |
8371696 | Johansson | Feb 2013 | B2 |
8550626 | Griggio | Oct 2013 | B2 |
8568311 | LaPlaca et al. | Oct 2013 | B2 |
8668334 | Krenik | Mar 2014 | B2 |
8696126 | Yoo et al. | Apr 2014 | B2 |
8931905 | Lewis | Jan 2015 | B2 |
8950864 | Massengill | Feb 2015 | B1 |
20030158497 | Graham et al. | Aug 2003 | A1 |
20070200927 | Krenik | Aug 2007 | A1 |
20090153796 | Rabner | Jun 2009 | A1 |
20100118264 | Sabel | May 2010 | A1 |
20100292999 | Verma | Nov 2010 | A1 |
20110205167 | Massengill | Aug 2011 | A1 |
20110267577 | Verma | Nov 2011 | A1 |
20130194389 | Vaught et al. | Aug 2013 | A1 |
20130285885 | Nowatzyk et al. | Oct 2013 | A1 |
20130308099 | Stack | Nov 2013 | A1 |
20140085282 | Luebke et al. | Mar 2014 | A1 |
20140192326 | Kiderman et al. | Jul 2014 | A1 |
20160007849 | Krueger | Jan 2016 | A1 |
20170290505 | Correns et al. | Oct 2017 | A1 |