This application is directed to medical equipment. In particular, this application is directed to a vision screening device, and associated systems and methods, for assessment of color vision and detection of color vision disorders.
Visual screening in children and adults typically includes one or more tests to determine various deficiencies associated with the patient's eyes. Such vision tests may include, for example, refractive error tests, accommodation tests, visual acuity tests, color vision screening, and the like. Conventional color vision screening tests present, to the patient, printed pseudoisochromatic plates (e.g., Ishihara or modified Ishihara plates) in which numerals, letters, or shapes are embedded. The patient is expected to provide feedback on the visibility of the embedded content in the plates. The color vision status of the patient is determined based on the patient's feedback. The color vision status may include normal color vision or the presence of a color vision deficiency. The type of color vision deficiency may also be determined from the test results.
Color vision may also be screened using a specialized device called an anomaloscope. This method also relies on a patient's ability to provide feedback, in this case on whether two colors match. It is typically used for further clinical evaluation of patients who have been identified as having a color vision deficiency based on the pseudoisochromatic plate-based tests.
The conventional color vision screening tests rely upon a patient's feedback upon viewing the color stimuli being presented to them. As a result, these tests are unsuitable for subjects who are unable to communicate or cooperate due to young age or a disability, as well as other uncooperative patients. Some studies have shown that approximately 8% of males and 0.5% of females exhibit some form of color vision deficiency. Though most types of color vision deficiency cannot be corrected, color vision screening is commonly administered so that the patient is made aware of any color vision deficiency, and can adjust for the impact of the deficiency on learning and performance of certain tasks.
Most vision tests are executed using a vision screening device, which may include ophthalmic testing devices such as phoropters, autorefractors, and photorefractors. The use of printed or digitally displayed plates to screen for color vision requires an additional step in the vision screening process that adds to the time and complexity of the screening. It would be advantageous to be able to screen for most vision problems using a single integrated device.
The various examples of the present disclosure are directed toward overcoming one or more of the deficiencies noted above.
In an example of the present disclosure, a vision screening device includes a first radiation source configured to generate color stimuli, a second radiation source separate from the first radiation source, and a sensor configured to capture radiation emitted by the second radiation source, and reflected by an eye of a patient. The vision screening device also includes a processor operably connected to the first radiation source, the second radiation source, and the sensor, and a memory storing instructions executable by the processor. The instructions when executed, cause the processor to cause the first radiation source to present a first color stimulus, and a second color stimulus different from the first color stimulus, to the patient during a period of time, cause the second radiation source to emit near-infrared radiation during the period of time, and cause the sensor to capture, during the period of time, first near-infrared radiation reflected by the eye and responsive to the first color stimulus, and second near-infrared radiation reflected by the eye and responsive to the second color stimulus. The instructions, when executed, also cause the processor to determine a first measurement associated with the eye and based on the first near-infrared radiation, determine a second measurement associated with the eye and based on the second near-infrared radiation, determine a difference between the first measurement and the second measurement, determine that the difference is less than a threshold value; and generate, based at least in part on determining that the difference is less than the threshold value, a recommendation associated with the patient.
In another example of the present disclosure, a vision screening device includes a housing, a first display disposed on a first side of the housing and configured to generate color stimuli, a second display disposed on a second side of the housing opposite the first side, a radiation source configured to emit near-infrared (NIR) radiation, and a sensor configured to capture NIR radiation, emitted by the radiation source, and reflected by an eye of a patient. The vision screening device also includes a processor operably connected to the first display, the second display, the radiation source, and the sensor, and memory storing instructions that, when executed by the processor, cause the processor to cause the first display to present a first color stimulus, and a second color stimulus different from the first color stimulus, to the patient during a period of time, cause the radiation source to emit NIR radiation during the period of time, and cause the sensor to capture, during the period of time, first near-infrared radiation reflected by the eye and responsive to the first color stimulus, and second near-infrared radiation reflected by the eye and responsive to the second color stimulus. The instructions, when executed, also cause the processor to determine a first refractive error based on the first near-infrared radiation, determine a second refractive error based on the second near-infrared radiation, determine a difference between the first refractive error and the second refractive error, determine that the difference is less than a threshold value, and generate, based at least in part on determining that the difference is less than the threshold value, a recommendation associated with the patient.
In still another example of the present disclosure, a method includes causing a first radiation source to present a first color stimulus, and a second color stimulus different from the first color stimulus, to a patient during a period of time, causing a second radiation source to emit near-infrared radiation during the period of time, and causing a sensor to capture, during the period of time, first data indicative of near-infrared radiation reflected by an eye of a patient and responsive to the first color stimulus, and second data indicative of near-infrared radiation reflected by the eye of the patient and responsive to the second color stimulus. The method also includes determining a first measurement associated with the eye and based on the first data, determining a second measurement associated with the eye and based on the second data, determining a difference between the first measurement and the second measurement, determining that the difference is less than a threshold value; and generating, based at least in part on determining that the difference is less than the threshold value, a recommendation associated with the patient.
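By way of non-limiting illustration only, the threshold comparison summarized above may be sketched as follows in simplified Python; the function name, variable names, and threshold value are hypothetical placeholders (not part of the present disclosure), and the measurements are assumed to have already been derived from the captured near-infrared radiation.

```python
def generate_recommendation(first_measurement: float,
                            second_measurement: float,
                            threshold: float = 0.25) -> str:
    """Compare the change in an ocular measurement (e.g., refractive error in
    diopters) across two color stimuli against a threshold and return a
    screening recommendation."""
    difference = abs(first_measurement - second_measurement)
    if difference < threshold:
        # Little or no ocular response to the change in color stimuli may
        # indicate a color vision deficiency.
        return "refer patient for additional color vision screening"
    return "normal color vision response; screening passed"


# Example usage with hypothetical refractive-error measurements (in diopters).
print(generate_recommendation(first_measurement=-0.50, second_measurement=-0.45))
```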
Features of the present disclosure, its nature, and various advantages, may be more apparent upon consideration of the following detailed description, taken in conjunction with the accompanying drawings.
In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different figures indicates similar or identical items or features. The drawings are not to scale.
The present disclosure is directed to, in part, a vision screening device, and corresponding methods. Such an example vision screening device may be configured to perform one or more vision screening tests on a patient and to output the results of the vision screening test(s) to an operator of the device, such as a physician or a physician's assistant. Specifically, the present disclosure is directed to devices and methods for a color vision screening test. For example, the vision screening device may generate one or more color stimuli, such as a series of color dots, color images, figures with color fringing, or other items useful for testing color vision capability of the patient. While the patient is viewing the color stimuli, the device may determine one or more measurements, such as a refractive error, pupil position, pupil size, and/or gaze angle, associated with one or more eyes of the patient.
Based at least in part on the measurements determined while the patient is viewing the color stimuli, the device may generate an output including at least one of a recommendation or a diagnosis associated with the patient. Such an output (e.g., the recommendation and/or the diagnosis) may be indicative of the color vision capability of the patient, e.g., whether the patient has normal color vision, requires additional screening, the type of color vision deficiency exhibited by the patient, and the like. For example, the device may compare the measurement(s) determined during the color vision screening test to standard testing data corresponding to normal color vision to provide the recommendation and/or diagnosis. In particular, the standard testing data may provide one or more thresholds or a range of values, and the output generated by the device may be based on the measurement(s) being less than the threshold(s) or being within the range of values. The output generated by the device may be displayed to the operator of the vision screening device. As such, the methods described herein may provide an automated diagnosis to assist the physician or other user of the vision screening device. The methods described herein may also provide an automated recommendation based on and/or indicative of such a diagnosis.
As will be described with respect to at least
Additional details pertaining to the above-mentioned devices and techniques are described below with reference to
As described herein, the vision screening device 104 may be configured to perform the color vision screening test on the patient 106. In examples, the color vision screening test may include displaying, in sequential order, a plurality of color stimuli, such as colored light patterns or images, configured to elicit an ocular response that causes a change in one or more measurements associated with eye(s) of the patient 106. In examples, the measurement(s) may comprise a measurement of refractive error of the eye(s) of the patient 106. For example, U.S. Pat. No. 9,237,846, the entire disclosure of which is incorporated herein by reference, describes systems and methods for determining refractive error based on photorefraction using pupil images captured under different illumination patterns generated by near-infrared (NIR) radiation sources. During the administration of the color vision screening test, the vision screening device 104 may detect pupils, retinas, and/or lenses of the eye(s) of the patient 106, acquire data comprising images and/or video data of the pupils/retinas/lenses, and the like. This data may also be used to determine a measurement of gaze angle or gaze direction of the eye(s) of the patient, which may be tracked over time to determine a pattern of gaze angles or gaze directions during the color vision screening test. The vision screening device 104 may transmit the data, via a network 108, to a vision screening system 110 for analysis to determine an output 112 associated with the patient 106. Alternatively, or in addition, the vision screening device 104 may perform some or all of the analysis locally to determine the output 112. Indeed, in any of the examples described herein, some or all of the disclosed methods may be performed in whole or in part by the vision screening device 104, or by the vision screening system 110 separate from the vision screening device 104. For instance, in such examples, the vision screening device 104 may be configured to perform any of the vision screening tests, color vision tests, and/or other methods described herein without being connected to, or otherwise in communication with, the vision screening system 110 via the network 108.
As shown schematically in
The vision screening device 104 may also include one or more radiation sensor(s) 116, such as cameras, configured to capture reflected radiation from the eye(s) of the patient during the vision screening test(s), including the color vision screening test. For example, the vision screening device 104 may emit, via the radiation source(s) 114, one or more beams of radiation, and may be configured to direct such beams at the eye(s) of the patient 106. The vision screening device 104 may then capture, via the radiation sensor(s) 116, corresponding radiation that is reflected back e.g., from pupils of the eye(s). Data corresponding to the reflected radiation captured by the radiation sensor(s) 116 may be used for determining the refractive error(s) and/or gaze angle(s) associated with the eye(s) of the patient 106. In examples, the radiation sensor(s) 116 may comprise NIR radiation sensor(s) to capture reflected NIR radiation while the eye(s) of the patient 106 are illuminated by the NIR radiation source(s) 114. The data captured by the NIR radiation sensor(s) 116 may be used in the measurement of the refractive error and/or gaze angle(s) of the eye(s) of the patient 106. The data may include images and/or video of the pupils, retinas, and/or lenses of the eyes of the patient 106. The data may be captured intermittently, during specific periods of the color vision or other vision screening test, or during the entire duration of the test(s). Additionally, the vision screening device 104 may process the image(s) and/or video data to determine change(s) in the refractive error and/or gaze angle(s) of the eye(s) of the patient 106. In examples, the reflected radiation from the patient's pupils may pass through a view window facing the patient 106, to reach the radiation sensor(s) 116.
The vision screening device 104 may also include one or more display screen(s) 118, such as a color liquid crystal display (LCD) or an organic light-emitting diode (OLED) screen. The display screen(s) 118 may include an operator display screen facing a direction towards the operator 102, configured to provide information related to the vision screening tests to the operator 102. In any of the examples described herein, the operator display screen 118 facing the operator 102 may be configured to display and/or otherwise provide the output 112 generated by the vision screening device 104 and/or generated by the vision screening system 110. The output 112 may include testing parameters, current status of the test, measurement(s) determined during the test, a diagnosis determined based on one or more tests, and/or a recommendation associated with the diagnosis. The display screen 118 facing the operator 102 may also display information related to or unique to the patient, and the patient's medical history.
In some examples, the display screen(s) 118 may also include one or more display screens facing in a direction towards the patient 106, and configured to display the plurality of color stimuli to the patient 106. The color stimuli may be provided to the patient by images displayed on the display screen(s) 118 facing the patient. The display screen(s) 118 for providing the color stimuli may be integrated with the vision screening device 104, or may be external to the device, and under computer program control of the device. Additional details of the display screen(s) 118 associated with the vision screening device 104, will be discussed with reference to at least
The vision screening device 104 may transmit the data captured by the radiation sensor(s) 116, via the network 108, using network interface(s) 120 of the vision screening device 104. In addition, the vision screening device 104 may also similarly transmit other testing data associated with the vision screening test(s) being administered, e.g., type of test, duration of test, patient identification and the like. The network interface(s) 120 of the vision screening device 104 may be operably connected to one or more processor(s) of the vision screening device 104, and may enable wired and/or wireless communications between the vision screening device 104 and one or more components of the vision screening system 110, as well as with one or more other remote systems and/or other networked devices. For instance, the network interface(s) 120 may include a personal area network component to enable communications over one or more short-range wireless communication channels, and/or a wide area network component to enable communication over a wide area network. In any of the examples described herein, the network interface(s) 120 may enable communication between, for example, the processor(s) 122 of the vision screening device 104, and the vision screening system 110, via the network 108. The network 108 shown in
The vision screening system 110 may be configured to receive data, from the vision screening device 104 and via the network 108, collected during the administration of the vision screening test(s), such as the color vision screening test. In some examples, based at least in part on processing the data, the vision screening system 110 may determine the output 112 associated with the patient 106. For example, the output 112 may include a recommendation and/or diagnosis associated with color vision capability of patient 106, based on an analysis of the data indicative of the refractive error(s) and/or gaze angle(s) associated with the eye(s) of the patient 106 in response to the plurality of color stimuli presented during the color vision test. The vision screening system 110 may communicate the output 112 to the one or more processor(s) 122 of the vision screening device 104 via the network 108. As noted above, in any of the examples described herein one or more such recommendations, diagnoses, or other outputs may be generated, alternatively or additionally, by the vision screening device 104.
As described herein, a processor, such as processor(s) 122, can be a single processing unit or a number of processing units, and can include single or multiple computing units or multiple processing cores. The processor(s) 122 can be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. For example, the processor(s) 122 can be one or more hardware processors and/or logic circuits of any suitable type specifically programmed or configured to execute the algorithms and processes described herein. As shown schematically in
The computer-readable media 124 can include volatile and nonvolatile memory and/or removable and non-removable media implemented in any type of technology for storage of information, such as computer-readable instructions, data structures, program modules, or other data. Such computer-readable media 124 can include, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, optical storage, solid state storage, magnetic tape, magnetic disk storage, RAID storage systems, storage arrays, network attached storage, storage area networks, cloud storage, or any other medium that can be used to store the desired information and that can be accessed by a computing device. The computer-readable media 124 can be a type of computer-readable storage media and/or can be a tangible non-transitory media to the extent that, when mentioned, non-transitory computer-readable media exclude media such as energy, carrier signals, electromagnetic waves, and signals per se.
The computer-readable media 124 can be used to store any number of functional components that are executable by the processor(s) 122. In examples, these functional components comprise instructions or programs that are executable by the processor(s) 122 and that, when executed, specifically configure the one or more processor(s) 122 to perform actions associated with one or more of the vision screening tests, such as the color vision screening test. For example, the computer-readable media 124 may store one or more functional components for administering the color vision screening test, such as a patient data component 126, a patient screening component 128, an emitter control component 130, a measurement component 132, a data analysis component 134, and/or an output generation component 136, as illustrated in
In examples, the patient data component 126 may be configured to store and/or access data associated with the patient 106. For example, the patient 106 may provide data, such as patient data 138, upon initiating a vision screening test. For instance, when the vision screening device 104 and/or vision screening system 110 initiates a vision screening test, the patient 106 may provide, or the operator 102 may request, the patient data 138 regarding the patient's demographic information, disability information, preferences, and the like. For example, the patient 106 may provide demographic information such as name, age, ethnicity, and the like. In such examples, the operator 102 may request the data while the screening is in progress, or before the screening has begun. In some examples, the operator 102 may be provided with predetermined categories associated with the patient 106, such as predetermined age ranges (e.g., six to twelve months, one to five years old, etc.), and may request the patient data 138 in order to select the appropriate category associated with the patient 106. In other examples, the operator 102 may provide a free form input associated with the patient data 138. In still further examples, an input element may be provided to the patient 106 directly.
Alternatively, or in addition, the vision screening device 104 and/or the vision screening system 110 may determine and/or detect the patient data 138 during the vision screening test. For example, the vision screening device 104 may be configured to generate image and/or video data associated with the patient 106 at the onset of the vision screening test. For instance, the vision screening device 104 may include one or more digital cameras, motion sensors, proximity sensors, or other image capture devices configured to collect images and/or video data of the patient 106, and one or more processors of the vision screening device 104 may analyze the data to determine the patient data 138, such as the distance of the patient 106 from the screening device. As an example, the vision screening device 104 may be equipped with a range finder, such as an ultrasonic range finder, an infrared range finder, and/or any other proximity sensor that may be able to determine the distance of the patient 106 from the screening device.
Alternatively, or in addition, the vision screening device 104 may be configured to transmit the images/video data to the vision screening system 110, via the network 108, for analysis to determine the patient data 138. For example, the vision screening device 104 may transmit the image/video data to the vision screening system 110 and the patient data component 126 may be configured to analyze the data to determine the patient data 138. Still further, the patient data component 126 may be configured to receive, access, and/or store the patient data 138 associated with the patient 106 and/or additional patients. For example, the patient data component 126 may store previous patient information associated with the patient 106 and/or other patients who have utilized the vision screening system 110. For instance, the patient data component 126 may store previous patient preferences, screening history, and the like. The patient data component 126 may receive the patient data 138 and/or may access such information via the network 108. For example, the patient data component 126 may access an external database, such as screening database 140, storing data associated with the patient 106 and/or other patients. The screening database 140 may be configured to store the patient data 138 stored in association with a patient ID. When the operator 102 and/or patient 106 enters the patient ID, the patient data component 126 may access or receive the patient data 138 stored in association with the patient ID and the patient 106.
In examples, the computer-readable media 124 may also store a patient screening component 128. The patient screening component 128 may be configured to determine the plurality of color stimuli for display, in a sequence, to the patient 106, via the vision screening device 104, during the color vision screening test. A library of color stimuli for use in the color vision screening test may be stored in the screening database 140 or be available on the computer-readable media 124, 144. The color stimuli may be based on psychophysical characteristics of human color vision, and configured to elicit differences in ocular response between patients with normal color vision and patients exhibiting types of color vision deficiency. The library may include different types of color stimuli, such as black figures including a color fringe displayed on a white background, graphics of a first color on a background of a second color different from the first color, color dot patterns, time-varying color images, and the like.
The patient screening component 128 may be configured to receive and/or access the patient data 138 from the patient data component 126 to determine the color stimuli to display to the patient 106. As an example, the patient screening component 128 may utilize the patient data 138 to determine a testing category that the patient 106 belongs to (e.g., a testing category based on age, disability, etc.). Based on the patient data 138, and/or the testing category, the patient screening component 128 may determine the plurality of color stimuli for display to the patient 106 from the library of color stimuli. For example, if the patient data 138 indicates that the patient is a toddler, the color stimuli may comprise color dot graphics, whereas if the patient data 138 indicates an older patient who is able to read, the color stimuli may include text on a white background exhibiting various color fringing or colored text on a different colored background.
In any of the examples described herein, the patient screening component 128 may determine a sequence of color stimuli that transition between colors that the patient may find hard to distinguish. The plurality of color stimuli may follow a sequence that checks for different types of color vision deficiency that may be exhibited by the patient. For example, there are two main types of red-green color blindness, Protan and Deutan. A patient with protanomaly, or Protan-type color blindness, may have difficulty perceiving differences between red and black. Such a patient may also have difficulty distinguishing between the color purple and the color pink. For example, to detect protanomaly, the patient screening component 128 may include a color stimulus that presents red graphics against a black background. In another example, the patient screening component 128 may include a purple dot pattern, followed by a pink dot pattern color stimulus, in the plurality of color stimuli. In a patient with protanomaly, there may be a small or no measurable change in the ocular response, as these colors are not easily distinguishable by the patient. In a patient with deuteranomaly, or Deutan-type color blindness, the colors green, yellow, orange, red, and brown may appear similar, especially in low light. It can also be difficult for such a patient to differentiate between blues and purples, or between pinks and grays. Based on these known symptoms of deuteranomaly, the patient screening component 128 may determine a sequence of color stimuli that transition between colors that the patient may find hard to distinguish, e.g., a blue color stimulus followed by a purple color stimulus, in order to detect the condition. Other conditions may be similarly screened for by including, in the plurality of color stimuli, a color stimulus or a sequence of color stimuli that would cause a response in a patient exhibiting the condition that is different from the normal color vision response. For example, tritanomaly may be screened for by transitions from blue to green and transitions from red to purple. The patient screening component 128 may include a color stimulus or a sequence of color stimuli for each type of color vision deficiency being screened for. The patient screening component 128 may also determine the sequence, and a duration of display for each color stimulus of the plurality of color stimuli.
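By way of non-limiting illustration only, the following simplified Python sketch shows one hypothetical way the patient screening component 128 could organize such stimulus sequences; the color pairs, labels, and durations shown are placeholders and are not limiting.

```python
# Hypothetical library of stimulus transitions (color pairs an affected patient
# may find hard to distinguish) for each deficiency type being screened for.
STIMULUS_SEQUENCES = {
    "protanomaly":   [("purple", "pink"), ("red", "black")],
    "deuteranomaly": [("blue", "purple"), ("green", "yellow")],
    "tritanomaly":   [("blue", "green"), ("red", "purple")],
}

def build_test_sequence(deficiencies, duration_s=2.0):
    """Assemble an ordered list of (color, display duration) pairs covering
    every transition needed to screen for the requested deficiency types."""
    sequence = []
    for deficiency in deficiencies:
        for first_color, second_color in STIMULUS_SEQUENCES[deficiency]:
            sequence.append((first_color, duration_s))
            sequence.append((second_color, duration_s))
    return sequence

# Example: screen for both red-green deficiency types.
print(build_test_sequence(["protanomaly", "deuteranomaly"]))
```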
In some examples, the color vision capability of the patient may be determined based on changes in the refractive error of the patient's eye(s) in response to changes in the color stimuli being presented to the patient. Because of chromatic aberration, the normal human eye focuses red, green, and blue light at slightly different focal planes, and the visual system uses the differences in focus among the three color channels to determine whether the eye is in focus, adjusting the focusing mechanisms of the eye as needed. Therefore, in a patient with normal color vision, the refractive error measurement(s) of the eye may change when the patient is subjected to a change in color stimuli designed to elicit such a change in refractive error (e.g., the refractive error may change by 0.25 or 0.5 diopters). Alternatively, or in addition, the gaze angle of the eye(s) may be indicative of the color vision capability of the patient. The gaze angle of the eye(s) can be tracked to evaluate whether the patient can see a color stimulus being presented. For example, if the patient is able to perceive different features in the color stimuli being presented, the patient may fixate on the features, following a pattern of gaze angles. However, if the patient has some form of color vision deficiency, the patient's eyes may not be able to perceive differences between different color stimuli, and the patient may therefore exhibit no change, or a limited change, in the refractive error measurement(s), and may exhibit a random pattern of gaze angles or shorter gaze times. The changes in the refractive error measurement(s) may be compared with threshold(s) and/or a range of values from standard testing data, as discussed, to determine whether the changes indicate a normal color vision response. The type of color vision deficiency (e.g., Protan or Deutan red-green color blindness, tritanomaly (blue-yellow color blindness), etc.) may also be determined from the specific change(s) in color stimuli for which the change in the refractive error measurement(s) of the patient's eye(s) is less than the threshold(s), or outside the range of values, for normal color vision. Similarly, the pattern of gaze angles exhibited by the patient may be compared with standard gaze angle patterns corresponding to a normal color vision response, to determine whether the pattern of gaze angles indicates a normal color vision response.
In some examples, the computer-readable media 124 may additionally store an emitter control component 130. The emitter control component 130 may be configured to operate the radiation source(s) 114 of the vision screening device 104. As discussed, the radiation source(s) 114 may include NIR LEDs for measuring the refractive error and/or gaze angle of the eye(s) of the patient 106, and/or color LEDs for generating the color stimuli for display to the patient 106. In examples, the emitter control component 130 may generate commands to operate and control the individual radiation sources, such as LEDs, of the color LEDs and the NIR LEDs. Control parameters of the LEDs may include intensity, duration, pattern and cycle time. For example, the commands may selectively activate and deactivate the individual color LEDs of the radiation sources 114 to produce the plurality of color stimuli indicated by the patient screening component 128. In alternative examples, where the plurality of color stimuli is provided by images displayed on display screen(s) 118 visible to the patient 106, the emitter control component 130 may control the intensity, duration, pattern and cycle time of the images displayed on the display screen. The emitter control component 130 may activate the NIR LEDs of the radiation source(s) 114 used for measuring the refractive error and/or gaze angle of the eye(s) of the patient 106 in synchronization with the presentation of the color stimuli during the performance of the color vision screening test.
The individual radiation sources, such as LEDs, of the radiation source(s) 114 may be controlled by the emitter control component 130 according to control parameters stored in the computer-readable media 124. For instance, control parameters may include intensity, duration, pattern, cycle time, and so forth, for the color LEDs and/or the NIR LEDs of the radiation source(s) 114. For example, with respect to intensity, the control parameters may direct the color LEDs to emit light that is bright enough to attract attention of the patient 106, while also limiting brightness to avoid pupil constriction. Further, the emitter control component 130 may use the control parameters to display color stimuli such as color dot patterns to the patient 106 using the color LEDs of the radiation source(s) 114. These patterns may include circular patterns, alternating light patterns, flashing patterns, patterns of shapes such as circles or rectangles, and the like. The emitter control component 130 may also use the control parameters to determine a duration that individual LEDs of the radiation source(s) 114 emit radiation (e.g., 50 milliseconds, 100 milliseconds, 200 milliseconds, etc.). Additionally, the emitter control component 130 may utilize the control parameters to alter an intensity and display pattern of NIR LEDs of the radiation source(s) 114 for the determination of refractive error of the eye(s) based on photorefraction and/or gaze angle of the eye(s).
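By way of non-limiting illustration only, the control parameters described above may be represented, for example, as in the following simplified Python sketch; the parameter values shown are hypothetical placeholders rather than calibrated settings.

```python
from dataclasses import dataclass

@dataclass
class EmitterControlParameters:
    intensity: float      # relative drive level, 0.0-1.0
    duration_ms: int      # on-time per activation (e.g., 50, 100, 200 ms)
    pattern: str          # e.g., "circular", "alternating", "flashing"
    cycle_time_ms: int    # period of one complete display cycle

# Color LEDs: bright enough to attract the patient's attention while limiting
# brightness to avoid pupil constriction (placeholder values).
color_led_params = EmitterControlParameters(
    intensity=0.4, duration_ms=200, pattern="circular", cycle_time_ms=1000)

# NIR LEDs: illumination synchronized with the presentation of the color stimuli.
nir_led_params = EmitterControlParameters(
    intensity=0.8, duration_ms=100, pattern="alternating", cycle_time_ms=200)

print(color_led_params, nir_led_params, sep="\n")
```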
In some examples, the computer-readable media 124 may additionally store a measurement component 132. In such examples, the measurement component 132 may be configured to activate the radiation sensor(s) 116 of the vision screening device 104 and thereby cause the radiation sensor(s) 116 to capture data. The measurement component 132 may also be configured to analyze the data collected, detected, and/or otherwise captured by components of the vision screening device 104 (e.g., by the radiation sensor(s) 116) during one or more vision screening tests. The measurement component 132 may analyze the data to determine one or more measurements associated with the patient 106, such as an accommodation of a lens of the eyes of the patient 106, motion information associated with the eyes of the patient 106, the refractive error of the eye(s) of the patient 106, the gaze angle of the eye(s) of the patient 106, and the like. For example, U.S. patent application Ser. No. 16/522,028, filed on Jul. 25, 2019, and incorporated herein by reference in its entirety, describes systems and methods for evaluating vision of a patient using image/video data captured by a sensor of a vision screening device. Any of the methods described in U.S. patent application Ser. No. 16/522,028 may be performed by the measurement component 132 in examples of the present disclosure.
For example, the measurement component 132 may be configured to receive, from the vision screening device 104, data, such as image data and/or video data captured by the radiation sensor(s) 116 during the color vision screening test. The data may be acquired while the plurality of color stimuli is being presented to the patient 106 by the vision screening device 104. In some examples, the measurement component 132 may determine the refractive error of one or both eyes of the patient 106, and/or a change in the refractive error of the eye(s), while the patient 106 is viewing each of the plurality of color stimuli. For example, the plurality of color stimuli displayed during the color vision screening test may include a first color stimulus, followed in the sequence by a second color stimulus. The measurement component 132 may receive, as part of the data, first data captured during the period of presentation of the first color stimulus, and second data captured during the period of presentation of the second color stimulus. Additionally, the data may include third data captured during the transition period between the presentation of the first color stimulus and the presentation of the second color stimulus. The measurement component 132 may determine the refractive error of the eye(s) based on the first data, and the refractive error of the eye(s) based on the second data, and may also determine a change in the refractive error responsive to the change in color stimuli from the first color stimulus to the second color stimulus. Alternatively, or in addition, the third data captured during the transition from the first color stimulus to the second color stimulus may be used to determine a change in the refractive error due to the change in color stimuli.
In another example, the measurement component 132 may be configured to determine a gaze direction or gaze angle of the eye(s) of the patient 106 in response to viewing the plurality of color stimuli being presented to the patient 106. For example, the gaze angle of the patient 106 may be determined by activating the radiation source(s) 114, such as the NIR LEDs, and directing the radiation emissions in the direction of the patient's 106 eye(s). In response, the cornea of the patient's 106 eye(s) may reflect the radiation, and the reflected radiation may be captured by the radiation sensor(s) 116. The measurement component 132 may utilize the image data and/or video data captured by the radiation sensor(s) 116 to determine a glint, or straight-line measurement, from the source of the light to the center of the eye (e.g., the origin of the reflection). As such, the measurement component 132 may utilize this information to determine a position, location, and/or motion of the pupil at different points in time during the presentation of the plurality of color stimuli. In other examples, the measurement component 132 may utilize the image/video data to determine the position or location of the pupil relative to the outside edges of the eye (e.g., the outline of the eye). The measurement component 132 may utilize the position and motion of the pupils of the eye(s) to determine the gaze angle of the eye(s) of the patient 106 during the presentation of the plurality of color stimuli.
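By way of non-limiting illustration only, the following simplified Python sketch shows one hypothetical way a gaze angle could be estimated from the pupil-to-glint offset described above; the coordinate values and the per-pixel angular scale are assumptions for illustration and are not part of the present disclosure.

```python
def estimate_gaze_angle(pupil_center, glint_center, degrees_per_pixel=0.05):
    """Estimate (horizontal, vertical) gaze angles in degrees from the offset,
    in image coordinates, between the pupil center and the corneal glint."""
    dx = pupil_center[0] - glint_center[0]
    dy = pupil_center[1] - glint_center[1]
    return dx * degrees_per_pixel, dy * degrees_per_pixel

# Example: pupil center offset 12 pixels horizontally and -4 pixels vertically
# from the glint in a captured image.
horizontal, vertical = estimate_gaze_angle(pupil_center=(652, 380), glint_center=(640, 384))
print(f"gaze angle ~ {horizontal:+.1f} deg horizontal, {vertical:+.1f} deg vertical")
```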
Further, the computer-readable media 124 may also store a data analysis component 134. The data analysis component 134 may be configured to receive, access, and/or analyze standard testing data associated with vision testing. For example, the data analysis component 134 may be configured to access or receive data from one or more additional databases (e.g., the screening database 140, a third-party database, etc.) storing testing data, measurements, and/or values indicating various thresholds or ranges within which testing values should lie. Such thresholds or ranges may be associated with patients having normal vision health under similar testing conditions, and may be learned or otherwise determined from standard testing. The data analysis component 134 may utilize the standard testing data for comparison against the measurement(s) generated by the measurement component 132 described above. For example, the standard testing data associated with the color vision screening test may indicate a threshold or a range, where the change in the refractive error of the patient's eye(s) responsive to a change in the color stimuli presented to the patient is greater than the threshold, or is within the range, when normal color vision is exhibited. Alternatively, or in addition, the standard testing data may also indicate a pattern of gaze angles expected, in response to the change in color stimuli, from a patient exhibiting normal color vision. The data analysis component 134 may compare the gaze angle measurement(s) generated by the measurement component 132 with the indicated standard pattern of gaze angles to determine whether normal color vision is being exhibited. The comparison may be based on variability of gaze angles, duration of fixation of gaze angles, location of fixation points, alignment of the gaze angle with the direction of the color stimulus, and the like. Separate threshold(s) and/or range(s) may be indicated for different types of measurement(s). In addition, the threshold and/or range may be the same for each transition from a color stimulus to a subsequent color stimulus of the plurality of color stimuli, or may be different for different transitions. For instance, the transition from a first color stimulus to a second color stimulus may be associated with a first threshold and/or range, whereas the transition from the first color stimulus to a third color stimulus may be associated with a second threshold and/or range, different from the first threshold and/or range. The threshold(s) and/or range(s) associated with the color vision screening test may also be based on the testing category of the patient 106, e.g., the age group or disability status of the patient 106, where the threshold(s) and/or range(s) may be different for different testing categories. The data analysis component 134 may store patient data, measurements associated with vision screening tests, test results, and other data in a database (e.g., the screening database 140) for comparison of data over time to monitor vision health status and changes in vision health.
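By way of non-limiting illustration only, the comparison of measured gaze behavior against standard testing data may be sketched as follows in simplified Python; the metrics and normal-vision ranges shown are hypothetical placeholders.

```python
def matches_normal_gaze_pattern(gaze_angle_variability_deg: float,
                                fixation_duration_ms: float,
                                max_variability_deg: float = 5.0,
                                min_fixation_ms: float = 300.0) -> bool:
    """Return True when the measured gaze metrics fall within ranges expected
    for a normal color vision response to the presented color stimulus."""
    return (gaze_angle_variability_deg <= max_variability_deg
            and fixation_duration_ms >= min_fixation_ms)

# Example: a short, scattered gaze pattern does not match the standard pattern.
print(matches_normal_gaze_pattern(gaze_angle_variability_deg=9.2,
                                  fixation_duration_ms=150.0))  # False
```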
Based on the comparison with a threshold and/or range described above, the data analysis component 134 may generate a pass/fail determination for each transition in the plurality of color stimuli displayed in a sequence to the patient 106. For example, if the change in refractive error determined by the measurement component 132 in response to a change in color stimuli exceeds a threshold value or falls within a range of the standard testing data, a pass determination may be made by the data analysis component 134, and a fail determination made otherwise. Alternatively, or in addition, the data analysis component 134 may generate a pass/fail determination for each color stimulus in the plurality of color stimuli displayed to the patient, based on the measured pattern of gaze angles matching the pattern of gaze angles indicating a normal color vision response to the color stimuli.
As discussed, a recommendation and/or a diagnosis associated with the color vision capability of the patient can be determined by comparing the measurements of refractive error of the patient's eyes when viewing different color stimuli. In examples, the system may compare the measurements with standard, or predetermined, measurements known to be associated with normal color vision capability. For example, criteria such as known thresholds and/or ranges of values of changes in the refractive error associated with normal color vision may be compared with the measurements obtained during the color vision screening test, to determine whether the patient's eye(s) exhibit deficiencies related to color vision. In other examples, measurements of gaze angles obtained during the color vision screening test may be used to determine whether the patient exhibits a pattern of gaze angles corresponding to a normal color vision response. Depending on whether the measurements satisfy the criteria for normal color vision, the system may generate a diagnosis and/or recommendation associated with the patient. For example, if the measurements satisfy the criteria, the system may generate a recommendation indicating that the patient has passed the vision screening test. If the measurements do not satisfy the criteria, the system may generate a recommendation including an indication that the patient has failed the screening, an indication of a diagnosis of a type of color vision deficiency exhibited by the patient, or a recommendation for additional screening. In examples, the system may also utilize one or more machine learning techniques to generate the diagnosis and/or recommendation. The recommendation, diagnosis, and/or measurements may be presented to the operator of the device via an interface of the device. For example, the recommendation may be displayed to the operator on a display screen 118 of the vision screening device 104. In examples, the operator display screen may not be visible to the patient, e.g., the operator display screen may face in a direction opposite the patient. The operator display screen may also display information related to the vision screening tests, patient data, progress of the screening, and/or measurements obtained during the screening to the operator.
The computer-readable media 124 may additionally store an output generation component 136. The output generation component 136 may be configured to receive, access, and/or analyze data from the data analysis component 134. For example, the output generation component 136 may utilize the pass/fail determinations of the data analysis component 134 to determine if the patient 106 is exhibiting “normal” color vision. For example, a patient with normal color vision, also known as a “trichromat,” uses three types of cones in the eye(s) and can typically perceive up to one million different shades of color. The output generation component 136 may generate an output 112 that may include a diagnosis associated with the color vision capability of the patient and/or a recommendation for further action(s). For example, a fail determination for one or more of the transitions in color stimuli may indicate an abnormality in color vision. In case an abnormality is determined, the specific transitions between the color stimuli for which the patient's measurement data resulted in a fail determination may be used to further determine a type of color vision deficiency that is being exhibited by the patient 106. If normal color vision is determined, the output generation component 136 may generate output 112 indicating that the patient 106 has passed the color vision screening test. Alternatively, if an abnormality is determined, the output generation component 136 may generate output 112 indicating that the patient 106 has failed the color vision screening test and/or indicating that the patient 106 should receive additional screening. The type of color vision deficiency determined may also be indicated in the output 112. The output 112 may be displayed to the operator 102 on the display screen(s) 118 of the vision screening device 104, or on another screen associated with the color vision screening test, e.g., an external computer display screen.
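By way of non-limiting illustration only, the following simplified Python sketch shows one hypothetical way the output generation component 136 could aggregate per-transition pass/fail determinations into the output 112; the transition labels and the mapping to deficiency types are placeholders.

```python
# Hypothetical mapping from a failed color-stimulus transition to a candidate
# deficiency type.
FAILED_TRANSITION_TO_DEFICIENCY = {
    ("purple", "pink"): "protanomaly (Protan red-green deficiency)",
    ("blue", "purple"): "deuteranomaly (Deutan red-green deficiency)",
    ("blue", "green"):  "tritanomaly (blue-yellow deficiency)",
}

def generate_output(transition_results):
    """transition_results maps (first_color, second_color) -> True for a pass
    determination or False for a fail determination."""
    failed = [t for t, passed in transition_results.items() if not passed]
    if not failed:
        return {"result": "pass", "recommendation": "normal color vision"}
    suspected = [FAILED_TRANSITION_TO_DEFICIENCY.get(t, "unclassified") for t in failed]
    return {"result": "fail",
            "suspected_deficiencies": suspected,
            "recommendation": "additional color vision screening recommended"}

# Example: the patient's measurements passed all transitions except purple -> pink.
print(generate_output({("purple", "pink"): False, ("blue", "purple"): True}))
```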
Although
The network interface(s) 148 may enable wired and/or wireless communications between the components and/or devices shown in system 100 and/or with one or more other remote systems, as well as other networked devices. For instance, at least some of the network interface(s) 148 may include a personal area network component to enable communications over one or more short-range wireless communication channels. Furthermore, at least some of the network interface(s) 148 may include a wide area network component to enable communication over a wide area network. Such network interface(s) 148 may enable, for example, communication between the vision screening system 110 and the vision screening device 104 and/or other components of the system 100, via the network 108.
It should be understood that, while
As discussed herein,
The radiation sources 210 may emit a plurality of radiation beams, including radiation beams 212A and 212B, configured to illuminate the eye(s) of the patient 206. The reflected visible and/or NIR radiation 212C from the eye(s) of the patient 206 may be captured via the sensor(s) 208. The vision screening device 202 may detect the pupils and/or lenses of the eyes of the patient 206 and/or acquire images and/or video data of the pupils/lenses via the sensor(s) 208. In examples, the NIR radiation 212C may be used to capture, via the sensor(s) 208, pupil images to determine the refractive error and/or the gaze angle of the eye(s) of the patient 206. Examples of capturing pupil images and determining corresponding refractive errors associated with the eyes of the patient are described in the disclosure of, for example, U.S. Pat. No. 9,237,846, referred to above and incorporated herein by reference.
The sensor(s) 208 may include optical components, such as one or more lenses, windows, prisms, filters, mirrors, and/or any other devices configured to collect and direct the radiation beams generated by the radiation sources 210, including radiation beams 212A and 212B, to the eye(s) of the patient 206. In some further examples, the optical components may also comprise a collimating lens, a convergent lens, a divergent lens, and/or any other substantially transparent lens or series of lenses configured to assist in directing the reflected beam 212C from the eye(s) of the patient 206 to impinge on the sensor(s) 208. In examples, the radiation beams, such as beams 212A and 212B, and the reflected beam 212C from the patient's pupils may pass through a transparent view window 216 facing the patient. Non-optical components of the vision screening device 202 may include, for example, an operator display screen 214. It is noted that the vision screening device 202 is not limited to the components listed here, and may incorporate additional components for furthering vision screening techniques.
As described herein,
As shown in
In some examples, the view window 318 may also function as a display screen facing towards the patient, which may be transparent to allow radiation to pass through without change. Alternatively, or in addition, only a portion of the view window 318 may be transparent to allow radiation to pass through to reach the sensor 324, while the remaining portion of the view window 318 may comprise a non-transparent display screen. The display screen may be used to present visual stimuli, such as the plurality of color stimuli used during the color vision screening test, to the patient. It is noted that the vision screening device 300 is not limited to the components listed here, and may incorporate more or fewer components for furthering vision screening techniques.
In some examples, the sensor 324 includes, for example, a complementary metal-oxide semiconductor (CMOS) sensor array, also known as an active pixel sensor (APS), or a charge-coupled device (CCD) sensor. In some examples, the lens component 322 is supported by the vision screening device 300 and positioned in front of the sensor 324. In still further examples, the sensor 324 has a plurality of rows of pixels and a plurality of columns of pixels. For example, the sensor 324 may include approximately 1280 by 1024 pixels, approximately 640 by 480 pixels, approximately 1500 by 1152 pixels, approximately 2048 by 1536 pixels, and/or approximately 2560 by 1920 pixels. The sensor 324 may be capable of capturing approximately 25 frames per second (fps), approximately 30 fps, approximately 35 fps, approximately 40 fps, approximately 50 fps, approximately 75 fps, approximately 100 fps, approximately 150 fps, approximately 200 fps, approximately 225 fps, and/or approximately 250 fps. Note that the above pixel values and frame rates are exemplary, and other values may be greater or less than the examples described herein.
In examples, the sensor 324 may include photodiodes having a light-receiving surface and having substantially uniform length and width. During exposure, the photodiodes convert the incident radiation to a charge. The sensor 324 may be operated with a global shutter. For example, substantially all of the photodiodes may be exposed simultaneously and for substantially identical lengths of time. Alternatively, the sensor 324 may be operated with a rolling shutter mechanism, in which the exposure moves as a wave from one side of the image to the other. Other mechanisms are possible to operate the sensor 324 in yet other examples. The sensor 324 may also be configured to capture digital image data. The digital image data can be captured in various formats, such as JPEG, BITMAP, TIFF, etc.
As discussed herein,
In examples, the radiation 402 emitted by one or more LEDs in the LED array 326 passes through the diffuser 316 and strikes the beam splitter 318. The diffuser 316 may act as a blur smoothing filter for light emitted by the LEDs in the LED array 326. At least a portion 404 of the radiation 402 reflects off of the beam splitter 318 and is directed to one or more eyes of the patient 206. While the portion 404 of the radiation 402 is directed at the eye(s) of the patient 206, the sensor 324 captures one or more images and/or video of the eye(s) of the patient 206. In examples, the image(s) and/or video may depict radiation that is reflected by the pupil(s) of the eye(s) of the patient 206 e.g., for use in the determination of the refractive error and/or the gaze angle of the eye(s).
An expanded view 406 of
As discussed with reference to
In alternative embodiments, the color LEDs may not be included in the LED array 326 and may instead be disposed on external surfaces of the housing 302 of the vision screening device 300.
In an alternative arrangement, the color LEDs 508 may be disposed on a side panel 504 on the first side 312 of the vision screening device 300, as shown in
As discussed with reference to
As discussed, the plurality of color stimuli may be generated by color radiation sources, such as the color LEDs 514 or the color LEDs 410 of the LED array 326 as described above, or may be presented to the patient as color images displayed on a display screen, such as the display screen 510. In yet another alternative example, the vision screening device 300 shown in
In any of the examples described herein (e.g., with reference to any of the example display screen configurations described herein), the plurality of color stimuli displayed during the color vision screening test may include a first color stimulus that is a standard colored image without any color shift. Such a standard image should elicit a baseline response from a viewer, and the image may comprise a digital image illustrating one or more objects, in color. In such examples, the plurality of color stimuli may also include a second color stimulus that is color shifted (e.g., that has been modified to accent, highlight, or otherwise emphasize) toward a first color (e.g., red). For instance, the second color stimulus may comprise the same image illustrated in the first color stimulus, but the pixels, LEDs, or other components of the display screen used to illustrate the one or more objects in the image may be controlled by the processor(s) 122 and/or the display screen to accentuate existing red colors present in the image. Additionally or alternatively, the processor(s) 122 and/or the display screen may control such components to shift, transition, change, and/or otherwise modify colors characterized by wavelengths that are close to red wavelengths in the visible spectrum (e.g., orange colors, yellow colors, etc.) such that they appear red or at least partly red in the second color stimulus. In such examples, the degree to which such colors are modified may be directly related to the difference between the wavelength of the respective color and the wavelength of red light. It is understood that the wavelength of red light is between approximately 620 nm and approximately 750 nm. For instance, objects with colors having a wavelength closer to the red wavelengths (e.g., orange light, having a wavelength between approximately 590 nm and approximately 620 nm) may be modified to appear more red in the second color stimulus relative to objects with other colors (e.g., yellow light, having a wavelength between approximately 560 nm and approximately 590 nm) having a wavelength further from the red wavelengths. Alternatively, in some embodiments, objects with colors beyond a desired color threshold may be shifted or otherwise modified to have a different color. For example, objects with colors having a wavelength above approximately 590 nm (e.g., orange) may be modified, in whole or in part, to have a wavelength of approximately 700 nm (e.g., red).
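By way of non-limiting illustration only, the wavelength-threshold color shift described in the preceding example may be sketched as follows in simplified Python; the sketch operates on hypothetical per-object dominant wavelengths rather than on actual display pixel values.

```python
RED_WAVELENGTH_NM = 700.0     # target red wavelength (within the ~620-750 nm range)
SHIFT_THRESHOLD_NM = 590.0    # colors at/above ~590 nm (orange and longer) are shifted

def red_shift(object_wavelengths_nm):
    """Replace orange-and-longer wavelengths with red, leaving shorter
    wavelengths (e.g., yellow, green, blue) unchanged."""
    return [RED_WAVELENGTH_NM if wavelength >= SHIFT_THRESHOLD_NM else wavelength
            for wavelength in object_wavelengths_nm]

# Example: an orange object (600 nm) becomes red; yellow (575 nm) and blue
# (480 nm) objects are unchanged.
print(red_shift([600.0, 575.0, 480.0]))  # [700.0, 575.0, 480.0]
```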
In any of the examples described herein, the plurality of color stimuli may further include a third color stimulus that is color shifted toward a second color different from the first color (e.g., green). For instance, the third color stimulus may comprise the same image illustrated in the first color stimulus, but the pixels, LEDs, or other components of the display screen used to illustrate the one or more objects in the image may be controlled by the processor(s) 122 and/or the display screen to accentuate existing green colors present in the image. Additionally or alternatively, the processor(s) 122 and/or the display screen may control such components to shift, transition, change, and/or otherwise modify colors characterized by wavelengths that are close to green wavelengths in the visible spectrum (e.g., yellow colors, cyan colors, etc.) such that they appear green or at least partly green in the third color stimulus. Further, although described above as comprising the same image, it is understood that in further examples, the first color stimulus may comprise a digital image illustrating one or more objects, in color, and at least one of the second color stimulus or the third color stimulus may comprise a different digital image illustrating one or more different objects.
In any of the examples described herein, the measurement component 132 and/or the processor(s) 122 may receive first data captured by the one or more sensors 324 of the vision screening device during the period of presentation of the first color stimulus, second data captured during the period of presentation of the second color stimulus, and third data captured during the period of presentation of the third color stimulus. Additionally, the measurement component 132 and/or the processor(s) 122 may receive data captured by the one or more sensors 324 during the transition periods between the presentation of the first color stimulus, the second color stimulus, and/or the third color stimulus. The measurement component 132 and/or the processor(s) 122 may determine a first refractive error of the eye(s) based on the first data, a second refractive error of the eye(s) based on the second data, and a third refractive error of the eye(s) based on the third data. The measurement component 132 and/or the processor(s) 122 may determine such first, second, and third refractive errors based on any of the processes described herein. The measurement component 132 and/or the processor(s) 122 may also determine a change in the refractive error responsive to the change in color stimuli. For instance, the measurement component 132 and/or the processor(s) 122 may determine a first difference between the first refractive error and the second refractive error. The measurement component 132 and/or the processor(s) 122 may also determine a second difference between the first refractive error and the third refractive error. The measurement component 132 and/or the processor(s) 122 of the vision screening device may also compare one or more such differences to corresponding difference thresholds indicative of normal color vision. For instance, when viewing the color stimuli described above, a patient with normal color vision may exhibit at least an approximately ¼ to ½ diopter shift between the first color stimulus and the second color stimulus. A patient with normal color vision may also exhibit an approximately ¼ to ½ diopter shift between the first color stimulus and the third color stimulus. It is understood that in some examples, other thresholds may be used. If the measurement component 132 and/or the processor(s) 122 determines that the first difference and the second difference are greater than or equal to the corresponding difference thresholds, the measurement component 132 and/or the processor(s) 122 may, based on such a determination, indicate or otherwise determine that the patient has normal color vision. On the other hand, if the measurement component 132 and/or the processor(s) 122 determines that the first difference and/or the second difference is less than the corresponding difference thresholds, the measurement component 132 and/or the processor(s) 122 may, based on such a determination, indicate or otherwise determine that the patient suffers from color blindness.
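The comparison described above may be illustrated, purely hypothetically, by the following minimal sketch. The function name, the 0.25 diopter lower bound, and the example measurement values are assumptions for illustration only.

```python
# A minimal, hypothetical sketch of the two-difference comparison described above.
# The 0.25 diopter threshold is an assumed lower bound of the ~1/4 to 1/2 diopter
# shift noted for normal color vision; actual thresholds may differ.

NORMAL_SHIFT_THRESHOLD_D = 0.25


def classify_color_vision(baseline_d: float,
                          red_shifted_d: float,
                          green_shifted_d: float,
                          threshold_d: float = NORMAL_SHIFT_THRESHOLD_D) -> str:
    """Compare refractive-error shifts against a normal-vision threshold."""
    first_difference = abs(red_shifted_d - baseline_d)
    second_difference = abs(green_shifted_d - baseline_d)
    if first_difference >= threshold_d and second_difference >= threshold_d:
        return "normal color vision"
    return "possible color vision deficiency"


# Example: a 0.35 D shift for the red-shifted stimulus but only a 0.05 D shift
# for the green-shifted stimulus suggests a possible deficiency.
print(classify_color_vision(baseline_d=-1.00, red_shifted_d=-0.65, green_shifted_d=-0.95))
```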
In some examples, the thresholds noted above may be selected based on the age, gender, ethnicity, and/or other characteristics of the patient. Additionally, different thresholds may be selected based on the type of color stimulus presented to the patient. For example, a first threshold or set of thresholds may be employed when presenting red shifted image(s) as the second color stimulus. In such an example, a second threshold or set of thresholds may be employed when presenting green shifted image(s) as the third color stimulus. In this way, refractive error differences may be compared to corresponding red/green difference thresholds to determine whether the patient suffers from red/green color blindness. On the other hand, a different set of thresholds may be employed when presenting red/blue shifted images as the second and third color stimuli, respectively. Thus, the above process may be utilized not only to determine the presence of protanomaly, deuteranomaly, and/or tritanomaly; depending on the thresholds used and the color shifting of the stimuli presented, the measurement component 132 and/or other computation components of the vision screening device may also be configured to determine the type of color deficiency detected.
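Purely as an illustration of such threshold selection, the sketch below keys hypothetical lower-bound thresholds to an age group and a shift type. The table keys and values are assumed for the example and are not values disclosed herein.

```python
# Hypothetical illustration of threshold selection keyed to patient data and the
# color shift of the stimulus pair; the keys and diopter values are assumptions.

DIFFERENCE_THRESHOLDS_D = {
    # (age group, shift type): assumed lower-bound threshold in diopters
    ("child", "red"): 0.25,
    ("child", "green"): 0.25,
    ("adult", "red"): 0.30,
    ("adult", "green"): 0.30,
    ("adult", "blue"): 0.20,
}


def select_threshold(age_years: float, shift_type: str) -> float:
    """Look up an assumed difference threshold for the patient and stimulus type."""
    age_group = "child" if age_years < 18 else "adult"
    return DIFFERENCE_THRESHOLDS_D[(age_group, shift_type)]


print(select_threshold(age_years=7, shift_type="red"))    # -> 0.25
```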
In some examples, the component screens 610 and 612 of the display screen 608 may display different content. For example, the component screen 610 may display the plurality of color stimuli, whereas the component screen 612 may display instructions or indicate a progress status of the color vision screening test. The component screens 610 and 612 of the display screen 608 may also be used to provide visual stimuli to the left and right eye of the patient, respectively, e.g., the component screen 610 may display a visual stimulus directed to the left eye, whereas the component screen 612 may display a visual stimulus directed to the right eye of the patient.
The display screens 602, 604, 606, 608, 618 shown in
In various examples, as described herein with reference to
The operations described below with respect to the methods illustrated in
At operation 704, the emitter control component 130 and/or one or more processors associated therewith causes a second radiation source, separate from the first radiation source, to emit radiation, e.g., near-infrared (NIR) radiation. For example, the second radiation source may comprise NIR LEDs of the radiation source(s) 114 of the vision screening device 104, NIR LEDs 326(b) of
The emitter control component 130 may also select and set an LED pattern of the LED array 326 for activation. In some examples, the arrangement of the NIR LEDs 326(b) in the LED array 326 allows for different illumination patterns to be presented to the eye(s) of the patient 106. The measurement accuracy of the refractive error of the eye(s) may depend upon the illumination pattern selected. Additional details regarding illumination patterns used in examination protocols for determining refractive error can be found in U.S. Pat. No. 9,237,846, referred to above and incorporated herein by reference.
At operation 706, the measurement component 132 may activate radiation sensor(s) 116 of the vision screening device 104 to capture first NIR radiation reflected by the eye(s) of the patient and responsive to the first color stimulus, and second NIR radiation reflected by the eye(s) of the patient and responsive to the second color stimulus. For example, the first NIR radiation may be captured during the presentation of the first color stimulus, and the second NIR radiation may be captured during the presentation of the second color stimulus. The measurement component 132 may receive first data indicative of the first NIR radiation captured, and second data indicative of the second NIR radiation captured by the sensor, e.g., the radiation sensor(s) 116. The first data and the second data may include image(s) and/or video of the eye(s) illuminated by NIR radiation. In some examples, the measurement component 132 may also determine a first pattern of gaze angles based on the first data, and a second pattern of gaze angles based on the second data.
At operation 708, the measurement component 132 may determine a first value and a second value of a measurement associated with the eye(s) of the patient. In examples, as discussed, the measurement may be the refractive error of the eye(s). The measurement component 132 may determine a first value of the refractive error of the eye(s) from the first data indicative of the first NIR radiation, and a second value of the refractive error of the eye(s) from the second data indicative of the second NIR radiation, using the techniques described in U.S. Pat. No. 9,237,846, referred to above and incorporated herein by reference, for determining refractive error from pupil images captured under NIR radiation illumination.
At operation 710, the measurement component 132 may determine a difference between the first value of the refractive error responsive to the first color stimulus, and the second value of the refractive error responsive to the second color stimulus. As discussed, in a patient with normal color vision, the refractive error measurement of the eye may change when subjected to a change in color stimuli, e.g., by presenting the first color stimulus, followed by the second color stimulus different from the first color stimulus at operation 702. For example, in some examples the difference between the first value and the second value may be in the range of approximately 0.25 to 0.5 diopters for a patient with normal color vision. The difference may be greater or less than this range based on individual differences and the color vision capability of the patient.
At operation 712, the data analysis component 134 may compare the difference between the first value and the second value obtained at operation 710 with a threshold value to determine that the difference is less than the threshold value. For example, the threshold value may be predetermined and available as a part of standard testing data, which may be stored in the screening database 140 or the computer-readable media 124, 144. The threshold may correspond to a lower threshold of difference between values of the refractive error in the eye in response to a change in color stimulus from the first color stimulus to the second color stimulus. If the difference determined at operation 710 is less than the threshold value, the output generation component 136 may generate a first output associated with the patient at operation 714, and if the difference is equal to or higher than the threshold value, the output generation component 136 may generate a second output at operation 716. The first output at operation 714 may correspond to an indication that the patient has failed the color vision screening test, a recommendation of additional screening, and/or a diagnosis of a type of color vision deficiency determined based on the first color stimulus and the second color stimulus, whereas the second output at operation 716 may correspond to an indication that the patient has passed the color vision screening, or that the patient exhibits normal color vision.
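The following sketch illustrates, under assumed names and values, how operations 710 through 716 could be composed; it is a hypothetical example rather than the claimed implementation.

```python
# A hypothetical end-to-end sketch of operations 710 through 716: compute the
# difference between two refractive-error values and emit one of two outputs.
# Function names, output labels, and numeric values are illustrative assumptions.

def screen_color_vision(first_value_d: float,
                        second_value_d: float,
                        threshold_d: float) -> dict:
    difference = abs(second_value_d - first_value_d)          # operation 710
    if difference < threshold_d:                              # operation 712
        return {"result": "fail",                             # operation 714
                "recommendation": "additional color vision screening"}
    return {"result": "pass",                                 # operation 716
            "recommendation": "no further color vision screening indicated"}


# Example: only a 0.05 D change against a 0.25 D threshold yields the first output.
print(screen_color_vision(first_value_d=-0.50, second_value_d=-0.45, threshold_d=0.25))
```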
In an alternative example, where the pattern of gaze angles is measured to evaluate color vision capability as discussed, one or more operations of process 700 may be omitted or modified. For example, at operation 708, the first value of the measurement may correspond to a pattern of gaze angles determined in response to a color stimulus generated by the first radiation source 114 at operation 702. In such examples, the pattern of gaze angles may be determined based on the reflected radiation captured by radiation sensor(s) 116 at operation 706. At operation 710, a difference may be determined (e.g., by the data analysis component 134) between the pattern of gaze angles and a standard pattern of gaze angles associated with normal color vision. Instead of comparing the difference to a threshold as indicated at operation 712, differences in variability of gaze angles, duration and location of fixation of gaze angles, alignment with the direction of the color stimulus, and the like may be analyzed by the data analysis component 134 to determine whether a first output corresponding to an abnormality is generated at operation 714, or a second output corresponding to a normal color vision response is generated at operation 716.
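As a loosely hedged illustration of this gaze-angle alternative, the sketch below compares assumed variability and fixation-duration metrics against an assumed standard pattern. The metrics chosen and the cut-off values are hypothetical and are included only to show one way such an analysis could be structured.

```python
# Hypothetical gaze-angle analysis along the lines described above. The metrics
# (angle variability, mean fixation duration) and the cut-offs are assumptions,
# not disclosed test parameters.

from statistics import pstdev


def gaze_pattern_abnormal(gaze_angles_deg: list[float],
                          fixation_durations_s: list[float],
                          standard_variability_deg: float,
                          standard_fixation_s: float) -> bool:
    """Flag an abnormal response if the pattern departs markedly from the standard."""
    variability = pstdev(gaze_angles_deg)
    mean_fixation = sum(fixation_durations_s) / len(fixation_durations_s)
    return (abs(variability - standard_variability_deg) > 2.0 or
            abs(mean_fixation - standard_fixation_s) > 0.5)


print(gaze_pattern_abnormal([1.0, 4.5, -3.0, 6.0], [0.2, 0.3, 0.25],
                            standard_variability_deg=1.5, standard_fixation_s=0.8))
```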
As discussed, the example process 700 may be performed by the components of the vision screening device 104 executed by the processor(s) 122 of the device 104. The example process 700 illustrates operations performed during at least a part of a color vision screening test administered to a patient. In alternative examples, some or all of the operations of process 700 may be executed by processor(s) 142 of a vision screening system 110 that is connected to the vision screening device 104 via network 108.
At operation 802, the patient data component 126 may receive patient data 138 associated with a patient participating in a vision screening test, such as the color vision screening test. As described herein, a patient being evaluated by the vision screening system may provide information to the system. In some examples, the patient may manually enter such information via one or more touchscreens or other input devices of the vision screening device. In additional examples, a physician or other operator of the device may manually enter such information via such input devices. Additionally, or alternatively, the system may receive and/or access data associated with the patient from a database of the system storing information associated with patients, such as the screening database 140. The received and/or accessed data may be analyzed by the system to determine one or more characteristics associated with the patient, such as demographic information, physical characteristics of the patient, etc. Still further, in some examples, the system may generate image or video data associated with the patient and may analyze the image/video data to determine the characteristics associated with the patient.
At operation 804, the patient screening component 128 may determine a plurality of color stimuli for display. The plurality of color stimuli may be displayed, one at a time and in sequence, to a patient participating in a color vision screening test. As described herein, the patient data received at operation 802 may be utilized to determine a testing category associated with the patient, e.g., a testing category based on age, gender, disability, and the like. The plurality of color stimuli for display to the patient may be selected from an existing library of color stimuli stored in a database of the system, such as the screening database 140, based on the patient data and/or the testing category associated with the patient. For example, the color stimuli may include color dot patterns, time varying color patterns, black figures including a color fringe on a white background, graphics of a first color on a background of a second color different from the first color, and the like. For instance, color stimuli comprising color dot patterns may be determined if the testing category indicates a toddler patient, or time-varying color patterns may be selected for a patient with a disability involving a short attention span. For an adult patient who is able to read, the color stimuli selected for display may be text or other graphics with color fringing on a white background, or colored text or other figures on a different colored background. In addition to the color stimuli, the database may also store a duration of display for each stimulus. The color stimuli and duration of display may be based on psychophysical characteristics of human color vision, and configured to elicit a difference in values of a measurement associated with the eye, such as a refractive error and/or a gaze angle, responsive to the change in color stimuli. In any of the examples described herein, and as described above with reference to any of the example display screen configurations described herein, at operation 804, the plurality of stimuli determined by the patient screening component 128 may include a first (e.g., baseline) color stimulus comprising an image illustrating an object, a second color stimulus modified to accentuate first (e.g., red) colors present in the image, and/or a third color stimulus modified to accentuate second (e.g., green) colors present in the image different from the first colors.
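As one hypothetical illustration of the stimulus selection at operation 804, the sketch below maps assumed testing categories to assumed library entries and display durations. None of these identifiers, categories, or durations are drawn from the screening database described herein.

```python
# Hypothetical sketch of stimulus selection at operation 804. Category names,
# stimulus identifiers, and durations are illustrative placeholders.

STIMULUS_LIBRARY = {
    "toddler": [("color_dot_pattern_baseline", 3.0),
                ("color_dot_pattern_red_shift", 3.0),
                ("color_dot_pattern_green_shift", 3.0)],
    "short_attention": [("time_varying_pattern_baseline", 2.0),
                        ("time_varying_pattern_red_shift", 2.0)],
    "adult_reader": [("text_color_fringe_baseline", 4.0),
                     ("text_color_fringe_red_shift", 4.0),
                     ("text_color_fringe_green_shift", 4.0)],
}


def determine_color_stimuli(testing_category: str) -> list[tuple[str, float]]:
    """Return (stimulus identifier, display duration in seconds) pairs."""
    return STIMULUS_LIBRARY[testing_category]


print(determine_color_stimuli("toddler"))
```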
At operation 806, the vision screening device 104 may display a color stimulus from the plurality of color stimuli determined at operation 804. The color stimulus displayed may be the next color stimulus in the sequence, starting with the first color stimulus at the start of the vision screening test, and the color stimulus may be displayed for the duration indicated for the color stimulus at operation 804. The display of the color stimuli may be controlled by the emitter control component 130, and additional parameters of the display may include intensity, pattern, cycle time, and so forth. The color stimulus may be displayed on a display screen 118, such as an LCD or OLED screen, associated with the vision screening device 104. The display screen may face the patient so that the display is visible to the patient. In alternative examples, the display of color stimuli comprising color dot patterns, or time-varying color dots, may be generated by color LEDs of the radiation source(s) 114 associated with the vision screening device 104. As described herein, the color LEDs may be embedded in an LED array, or distributed on a housing of the vision screening device, facing in a direction towards the patient. In any of the examples of color stimuli described herein, the color stimulus may also be fogged or defocused. A defocused stimulus may reduce the ability of the patient's eye(s) to accommodate while viewing and/or focusing on the stimulus, which may result in a more accurate determination of the refractive error of the eye.
At operation 808, the measurement component 132 may determine the refractive error of the eye(s) of the patient responsive to the color stimulus displayed at operation 806. The refractive error may be determined based on characteristics of NIR radiation reflected from the pupil of the eye(s) of the patient, for example. As described herein, the radiation source(s) 114 of the vision screening device 104 may include NIR LEDs configured to emit NIR radiation directed to the eye(s) of the patient. The reflected radiation may be captured by a sensor of the radiation sensor(s) 116 of the vision screening device 104, and used for the determination of refractive error as described in U.S. Pat. No. 9,237,846, referred to above and incorporated herein by reference.
At operation 810, the vision screening device 104 may determine if there are color stimuli remaining to be displayed from the plurality of color stimuli determined at operation 804. As discussed, each color stimulus of the plurality of color stimuli is to be displayed one at a time, in a sequence. The emitter control component 130 may keep track of the color stimulus that was displayed at operation 806, and an index indicating its position in the plurality of color stimuli. If the index of the color stimulus at operation 806 is at the last position in the plurality of color stimuli, then there are no further color stimuli remaining, and the process 800 moves to operation 812 as shown. If the position of the color stimulus at operation 806 is not the last position, then there are additional color stimuli remaining to be displayed, and the process 800 returns to operation 806 to display the next color stimulus in the sequence from the plurality of color stimuli.
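The control flow of operations 806 through 810 may be illustrated, purely hypothetically, as the following loop. The display and measurement functions are stand-ins for the device components described above, and the numeric values are arbitrary.

```python
# Hypothetical control-flow sketch of operations 806-810: display each stimulus
# in sequence, measure a refractive error for each, and stop after the last one.

def display_stimulus(identifier: str, duration_s: float) -> None:
    # Stand-in for operation 806; a real device would drive the display screen here.
    print(f"displaying {identifier} for {duration_s:.1f} s")


def measure_refractive_error(identifier: str) -> float:
    # Stand-in for operation 808; the diopter values below are arbitrary examples.
    return {"baseline": -0.50, "red_shift": -0.20, "green_shift": -0.15}[identifier]


def run_screening(stimuli: list[tuple[str, float]]) -> list[float]:
    refractive_errors = []
    for identifier, duration_s in stimuli:        # the loop position serves as the index of operation 810
        display_stimulus(identifier, duration_s)  # operation 806
        refractive_errors.append(measure_refractive_error(identifier))  # operation 808
    return refractive_errors                      # all stimuli shown; proceed to operation 812


print(run_screening([("baseline", 3.0), ("red_shift", 3.0), ("green_shift", 3.0)]))
```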
At operation 812, the data analysis component 134 may determine differences between the refractive errors obtained at operation 808 in response to the display, at operation 806, of each color stimulus of the plurality of color stimuli. As described herein, the plurality of color stimuli is displayed one at a time and in sequence at operation 806, and a refractive error is determined for each color stimulus at operation 808. For example, a first refractive error may be determined responsive to a first color stimulus, a second refractive error may be determined responsive to a second color stimulus, and a difference between the first refractive error and the second refractive error may be determined at operation 812. A difference or change in the value of the refractive error is determined between the display of each color stimulus and the display of the subsequent color stimulus, i.e., for each transition of color stimulus in the sequential display of the plurality of color stimuli. In some examples, at operation 812, the data analysis component 134 and/or the processor may determine the presence of color blindness based on the determined differences. Additionally or alternatively, depending on the thresholds used and the color shifting of the stimuli presented, the measurement component 132, the data analysis component 134, and/or other computation components of the vision screening device may also determine the type of color deficiency detected. For instance, as described herein, and in examples in which a first (e.g., a baseline) color stimulus and a second color stimulus are presented, a first threshold or set of thresholds may be employed when presenting red shifted image(s) as the second color stimulus. If a third color stimulus is presented to the patient in such an example, a second threshold or set of thresholds may be employed when presenting green shifted image(s) as the third color stimulus. In this way, refractive error differences may be compared to corresponding red/green difference thresholds to determine whether the patient suffers from red/green color blindness. A similar process may be used, at operation 812, when testing for other types of color blindness.
At operation 814, the output generation component 136 generates an output based on the differences determined at operation 812. The data analysis component 134 may access standard testing data, e.g., from the screening database 140, indicating a threshold and/or range of change in refractive error, corresponding to normal color vision, for each transition from a color stimulus to the subsequent color stimulus in the sequence. Based on a comparison with the threshold and/or range established in the standard testing data, the data analysis component 134 may determine whether the difference in refractive error, determined at operation 812 for each transition, exceeds the threshold, or is within the range, indicated for normal color vision. If the difference is less than the threshold, or outside the range, an abnormality in color vision is determined. Otherwise, a normal color vision response is determined. If every transition of color stimulus is determined to produce a normal color vision response, i.e., the difference in refractive error measurement exceeds the threshold, or falls within the range, of normal color vision, the output generated at operation 814 may indicate that the patient exhibits normal color vision, or that the patient has passed the vision screening test. If, instead, one or more of the transitions are determined to show an abnormality, the output may indicate that the patient has failed the vision screening test. In this instance, the output may also include a recommendation that the patient undergo further screening, or may include a diagnosis of a type of color vision deficiency based on the abnormalities determined. The output may be displayed to an operator of the vision screening device 104 on the display screen(s) 118.
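As a final hypothetical sketch, the per-transition analysis of operations 812 and 814 could resemble the following. The ¼ to ½ diopter range is the example range noted earlier for normal color vision; the function names and output labels are assumptions, and real thresholds would come from the standard testing data.

```python
# Hypothetical sketch of operations 812 and 814: compute the change in refractive
# error across each consecutive pair of stimuli and report pass/fail against an
# assumed normal-vision range of 0.25-0.50 diopters.

NORMAL_RANGE_D = (0.25, 0.50)


def analyze_transitions(refractive_errors_d: list[float]) -> dict:
    """Return a pass/fail output based on per-transition refractive-error changes."""
    differences = [abs(b - a) for a, b in zip(refractive_errors_d, refractive_errors_d[1:])]
    abnormal = [d for d in differences if not (NORMAL_RANGE_D[0] <= d <= NORMAL_RANGE_D[1])]
    if abnormal:
        return {"result": "fail", "differences_d": differences,
                "recommendation": "further color vision screening recommended"}
    return {"result": "pass", "differences_d": differences}


# Example: transitions of 0.30 D and 0.35 D both fall within the assumed range.
print(analyze_transitions([-0.50, -0.20, -0.55]))
```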
Based at least on the description herein, it is understood that the vision screening devices and associated systems and methods of the present disclosure may be used to assist in performing one or more vision screening tests, including a color vision screening test. The components of the vision screening device described herein may be configured to determine and present a plurality of color stimuli to a patient undergoing vision screening, determine a measurement associated with the eye of the patient responsive to the color stimuli, and determine an output indicating a diagnosis, recommendation, or results of the screening test. An exemplary vision screening device may include radiation source(s) and/or display screen(s) for generating the plurality of color stimuli, radiation source(s) and sensor(s) for generating and capturing near-infrared radiation for determining a refractive error of the eye of the patient, and display screen(s) for displaying the output to an operator of the vision screening device. The device described herein may be used for screening a patient for color vision deficiency without requiring inputs or feedback from the patient, thereby allowing the device to be used for screening young or uncooperative patients.
The foregoing is merely illustrative of the principles of this disclosure, and various modifications can be made by those skilled in the art without departing from the scope of this disclosure. The above-described examples are presented for purposes of illustration and not of limitation. The present disclosure can also take many forms other than those explicitly described herein. Accordingly, it is emphasized that this disclosure is not limited to the explicitly disclosed methods, systems, and apparatuses, but is intended to include variations to and modifications thereof, which are within the spirit of the following claims.
As a further example, variations of apparatus or process limitations (e.g., dimensions, configurations, components, process step order, etc.) can be made to further optimize the provided structures, devices and methods, as shown and described herein. In any event, the structures and devices, as well as the associated methods, described herein have many applications. Therefore, the disclosed subject matter should not be limited to any single example described herein, but rather should be construed in breadth and scope in accordance with the appended claims.
This application is a nonprovisional of, and claims priority to, U.S. Provisional Application No. 63/232,999, filed Aug. 13, 2021, the entire disclosure of which is incorporated herein by reference.