Vision screening in both clinical and basic science is a critical step that quantifies functional deficits in the visual system. In clinical practice, vision screening is essential for disease diagnosis and monitoring. In basic science, it can quantify sensory or perceptual performance or ensure that research participants meet specific study inclusion or exclusion criteria.
Recent social distancing measures and developments in communication, display, and sensor technologies mean that remote vision screening may play a significant role in teleophthalmology. Clinical guidelines recommend vision screening at multiple-year intervals. However, significant vision changes could go undetected over these long intervals, especially in cases of gradual loss. Self-administered vision screening can serve an important home monitoring role between clinic visits, particularly in remote and medically underserved locations.
The human visual system includes multiple interdependent pathways that are structurally and functionally specialized and may be selectively affected across the lifespan. Comprehensive vision screening therefore ideally requires the administration of multiple tests that assess the integrity of different visual pathways. However, practical constraints limit the number of tests that can be administered. Furthermore, in many cases, vision tests require the subject to learn a new task or a new set of stimuli for each test and to complete many trials where they are forced to guess because the paradigm requires the presentation of sub-threshold stimuli. These factors can be frustrating for subjects and may confound attention, learning, and memory effects with visual function deficits. Existing technology requires compromises in the number and duration of tests administered, with the risk that the vision screening is inaccurate due to noisy or under-constrained data, or incomplete because only a subset of tests is administered.
The present technology provides methods and devices for rapid, self-administered, and adaptive testing of a wide variety of visual and neurological impairments. The methods are based on graphical presentation to a subject of visual stimuli of graded intensity and the use of psychometric functions to determine the subject's sensitivity to selected stimuli. Diagnosis of ophthalmic, optometric, and/or neurologic conditions is achieved from the subject's stimulus sensitivity pattern.
The technology can be further summarized by the following list of features.
1. A method for testing a visual or neurological function of a human subject, the method comprising:
(a) providing a device having a graphical display and a user input;
(b) displaying sequentially a set of grids on the display, each grid comprising a plurality of cells; wherein each grid comprises a visual stimulus displayed in two or more of the cells of the grid; wherein the visual stimulus displayed within a grid varies in intensity from cell to cell; and wherein the stimulus displayed for each grid differs from the stimulus displayed for at least one other grid of the set;
(c) receiving subject responses through the user input, the responses indicating a perceived characteristic of the stimulus for each cell of each displayed grid; and
(d) analyzing the subject responses from each grid using a sensitivity function to obtain the subject's responsiveness to each of the stimuli in the set of grids, said responsiveness characterized as a probability of reporting the stimulus as a function of stimulus intensity.
2. The method of feature 1, further comprising:
(e) analyzing the subject's responsiveness to two or more of the stimuli of the set of grids to obtain a pattern of responsiveness of the subject.
3. The method of feature 2, further comprising:
(f) comparing the subject's pattern of responsiveness to one or more known patterns of responsiveness; and
(g) identifying a presence or absence in the subject, or a likelihood thereof, of one or more visual or neurological conditions.
4. The method of any of the preceding features, wherein the perceived characteristic of the stimulus comprises one or more stimulus characteristics selected from the group consisting of absence, presence, luminance, contrast, color, depth, motion, flicker, spatial form, object recognition, object shape, object form, object size, facial recognition, facial feature recognition, feature position, feature angle, spatial resolution, noise-defined depth, and sparse-pattern depth.
5. The method of any of the preceding features, wherein the stimulus intensity within a grid spans a range from difficult-to-detect to easy-to-detect for the subject.
6. The method of any of the preceding features, wherein the position of stimulus-containing cells within the grid is random or non-random.
7. The method of any of the preceding features, wherein the stimuli in the cells of each grid are displayed only one at a time, with all other cells of the grid remaining blank until the subject's response is obtained for the displayed cell.
8. The method of any of the preceding features, wherein the format of one or more grids comprises a variable number of rows and columns.
9. The method of any of the preceding features, wherein one or more grids are displayed for each stimulus.
10. The method of any of the preceding features, wherein stimulus type or stimulus intensity within a grid is varied from an earlier-presented grid based upon subject responses.
11. The method of any of the preceding features, wherein the subject responds for each cell of a grid whether the stimulus is present or not present in the cell, and wherein the subject's sensitivity to the stimulus displayed in each grid is calculated.
12. The method of any of the preceding features, wherein the subject indicates a degree of confidence in their response for each cell based on a position of their response or a secondary response.
13. The method of any of the preceding features, wherein the sensitivity function is a d-prime function, defined as:
where τ is the sensitivity threshold (stimulus intensity where d′=1), β is an upper asymptote of the saturating function (stimulus intensity where d′=5), s is signal intensity, and γ is the slope of the function; and wherein d′(s) is related to the probability of the subject reporting the presence of the stimulus as a function of stimulus intensity by the following psychometric function:
Ψyes(s) = 1 − G(z(1 − Ψyes(0)) − d′(s))
where G(s) is a cumulative Gaussian function, z is a z-score, and Ψyes(0) is the false alarm rate.
14. The method of feature 13, wherein the psychometric function is computed on-the-fly for each grid and is used to estimate a stimulus intensity for which d′=0.1, which is very difficult for the subject to detect, and a stimulus intensity for which d′=4.5, which is very easy for the subject to detect.
15. The method of feature 13 or 14, wherein the test is optimized for the subject by performing two or more trials of the set of grids, wherein the stimulus intensities on the first trial are based on data from previous observers or on physical stimulus limits of the display, and wherein the stimulus intensities on subsequent trials are based on the estimate of sensitivity computed for all previous grids for the current observer.
16. The method of any of the preceding features, wherein both threshold stimulus intensity and suprathreshold performance of the subject are determined.
17. The method of feature 16, wherein individual cells comprise two or more stimuli, and the subject's response comprises discrimination between the two or more stimuli.
18. The method of any of features 1-12, wherein the sensitivity function is an orientation error function, defined as:
where τ is a sensitivity threshold, θi is intrinsic orientation uncertainty within the subject's visual system, s is signal intensity, and γ is the slope of the function.
19. The method of any of features 1-12, wherein the sensitivity function is a cumulative Gaussian function, defined as:
where τ is a sensitivity threshold, pguess is the probability of a correct response for a guess (equal to the reciprocal of the number of alternative response choices), s is signal intensity, and γ is the slope of the function.
20. The method of any of features 1-12, wherein sensitivity to one or more of the stimuli can vary in two or more dimensions, and wherein a known relationship exists between said one or more stimuli and two or more types of subject sensitivity thereto.
21. The method of feature 20, wherein the two or more types of subject sensitivity comprise spatial frequency and contrast, and wherein the known relationship is defined by:
22. The method of feature 20, wherein the two or more types of subject sensitivity comprise spatial frequency, temporal frequency, and contrast, and wherein the known relationship is defined by:
23. The method of feature 20, wherein the two or more types of subject sensitivity comprise color saturation and hue angle, and wherein the known relationship is defined by:
wherein τ is visual color sensitivity for stimulus intensity s, h is hue angle, and k is color saturation.
24. The method of feature 20, wherein the two or more types of subject sensitivity comprise stimulus variance and response variance, and wherein the known relationship is defined by an equivalent noise function defined as:
wherein τ is the visual detection threshold for stimulus intensity s, σint is intrinsic noise in the observer's visual system, σext is external noise in the stimulus, and Nsamp is sampling efficiency, corresponding to the number of stimulus samples employed by the observer.
25. The method of feature 20, wherein the two or more types of subject sensitivity comprise stimulus pedestal intensity and sensitivity, and wherein the known relationship is defined by a dipper, or threshold-versus-intensity, function defined as:
wherein τ is visual detection threshold for stimulus intensity s, σint is intrinsic noise in the observer's visual system, σiex is the intensity of the stimulus pedestal, and S is the discrimination criterion employed by the observer.
26. The method of any of the preceding features, wherein the subject provides responses using a touch-sensitive display screen, computer pointing device, or speech recognition software.
27. The method of any of the preceding features, wherein the method is supervised or self-administered by the subject outside of a medical facility, vision testing facility, or doctor's office.
28. The method of any of the preceding features, wherein the method is repeated after one or more time intervals.
29. The method of any of the preceding features, wherein the method is used to detect and/or monitor the progression of an ophthalmic condition, an optometric condition, or a neurologic disease or condition.
30. The method of feature 29, wherein the ophthalmic disease or condition is selected from the group consisting of age-related macular degeneration and other disorders of early visual neural pathways; diabetic retinopathy; color vision deficit; glaucoma; and amblyopia.
31. The method of feature 29, wherein the optometric condition is selected from the group consisting of myopia, hyperopia, and other optical aberrations of lower and higher order; presbyopia; astigmatism; and cataract, corneal edema, and other changes in optical opacity.
32. The method of feature 29, wherein an optometric or ophthalmic condition is detected or monitored, and wherein visual acuity is determined using as stimulus an oriented arc in each cell, wherein the arc comprises a gap whose angular position is registered by the subject as a measure of arc orientation.
33. The method of feature 32, wherein the arc comprises a line width that is ⅕ of the arc diameter and the gap angle is equal to the line width.
34. The method of feature 32, wherein the angular position of the gap is registered by the subject at a cell boundary as a measure of arc orientation.
35. The method of feature 32, wherein a series of cells are provided to the subject in which stimulus detection spans a range personalized for the subject from easily visible to subthreshold visible.
36. The method of feature 32, wherein a series of cells are provided to the subject in which stimulus detection spans a range from easily visible for a person with 20/200 vision to subthreshold for a person with 20/10 vision.
37. The method of feature 32, wherein, when the subject's visual function based on performance on previous grids is atypical, the stimulus dimensions are extended.
38. The method of feature 32, wherein the stimulus luminance and background luminance are adjusted to measure the subject's performance across a range of luminance and contrast conditions.
39. The method of feature 32, wherein luminance intensity and size of a cell boundary are adjusted to generate a glare source.
40. The method of any of features 32-39, wherein the method is used to determine and/or monitor a visual correction of the subject.
41. The method of feature 29, wherein the neurologic disease or condition is selected from the group consisting of concussion, traumatic brain injury, traumatic eye injury, and other types of neurological trauma; cognitive impairment, Autism Spectrum Condition (ASC), Attention Deficit Disorder (ADD), and other high level neurological disorders; and schizophrenia, depression, bipolar disorder, and other psychotic disorders.
42. The method of feature 41, wherein the neurologic disease or condition detected or monitored is selected from the group consisting of prosopagnosia, object agnosia, and affective disorders, and wherein a series of cells comprising face or object images are presented to the subject in which stimulus pairs comprising a first stimulus category and a second stimulus category are progressively blended, and wherein the subject's response comprises identifying for each cell whether the first stimulus category or the second stimulus category is displayed.
43. The method of feature 42, wherein the stimulus pairs comprise objects, animals, faces of different identity, faces displaying different emotion, and faces of different gender.
44. The method of any of features 29-43, wherein said detection and/or monitoring of the progression of an ophthalmic condition, an optometric condition, or a neurologic disease or condition comprises analysis of a pattern of sensitivities as shown in Table 1.
45. A device for performing the method of any of the preceding features, the device comprising a graphic display, a user input, a processor, a memory, optionally wherein the processor and/or memory comprise instructions for performing said method.
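The signal detection relationship in features 13 and 14 can be illustrated in code. The sketch below, a Python standard-library implementation, computes Ψyes(s) = 1 − G(z(1 − Ψyes(0)) − d′(s)) from a given d′ value and false alarm rate; the function name and numeric values are illustrative assumptions, not part of the specification.

```python
from statistics import NormalDist

_N = NormalDist()  # standard Gaussian: cdf plays the role of G, inv_cdf of z


def psi_yes(d_prime: float, false_alarm_rate: float) -> float:
    """Probability of a 'yes' response given sensitivity d' and the
    false alarm rate Psi_yes(0), which fixes the response criterion."""
    criterion = _N.inv_cdf(1.0 - false_alarm_rate)  # z(1 - Psi_yes(0))
    return 1.0 - _N.cdf(criterion - d_prime)


# With d' = 0 the predicted 'yes' rate equals the false alarm rate,
# and it grows toward 1 as d' increases.
print(round(psi_yes(0.0, 0.1), 3))  # 0.1
print(round(psi_yes(4.5, 0.1), 3))  # 0.999: an "easy" stimulus
```

Note that the criterion is recovered entirely from the false alarm rate, which is why the catch (stimulus-absent) cells described later in the specification are needed to anchor the fit.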
The present technology provides rapid and easy-to-administer methods for comprehensively assessing visual function as well as neurological function in humans.
One aspect of the technology is a method for testing a visual or neurological function of a human subject. The method includes the steps of: (a) providing a device having a graphical display and a user input device or function; (b) displaying sequentially a set of grids on the display, each grid comprising a plurality of cells in which a visual stimulus can be displayed; (c) receiving subject responses through the user input, wherein the subject identifies at least which cells contain the stimulus; and (d) analyzing the subject responses from each grid using a sensitivity function to obtain the subject's responsiveness to each of the stimuli in the set of grids. The sensitivity function describes the probability of the subject reporting the stimulus as a function of stimulus intensity. Optionally, the method also can include the further step of: (e) analyzing the subject's responsiveness to two or more different stimuli of the set of grids to obtain a pattern of responsiveness of the subject. As a further option, the method can still further include the steps of: (f) comparing the subject's pattern of responsiveness to one or more known patterns of responsiveness; and (g) identifying a presence or absence in the subject, or a likelihood thereof, of one or more visual or neurological conditions.
The method can be carried out on any general purpose computational device, such as a personal computer, laptop, tablet, or mobile phone, or on a special purpose device that has a graphical display and a user input function, such as a touch screen, a pointing device such as a mouse or trackball, a keyboard, buttons, or a microphone or headset together with speech recognition software. The device can further include one or more speakers for presenting instructions, commands, or auditory stimuli. The method is well suited for self-administration by the subject in any environment, such as home or office, with or without assistance by a trained person. The computer or other device can optionally transmit results of the subject's test to a remote facility for further analysis or for attention by a medical or optometric professional. The computer or special purpose device can be portable and battery operated for use in the field, such as at a sports event or on a battlefield. A computer or other device used for conducting the tests will include a display, input device, processor, memory, and preferably a radio transceiver for wireless communication. The processor and/or memory can be pre-loaded with software for implementing the test, calculating results, storing results, and transmitting results to another computer or other device.
The display is preferably a color display capable of high resolution graphical representation of images with or without animation (changes in the image over time). A preferred presentation of the test images is in the form of a grid or matrix composed of a number of cells that are similar or identical in size and shape, although images also can be presented singly on the display or in other arrangements, including random or patterned arrangements. A grid typically presents a rectangular ordering of cells, and the cells can themselves be rectangular, square, round, elliptical, or have another shape. Such a rectangular array can have any desired number of cells arranged in rows and columns. For example, a grid can have cells arranged in a 2×2, 2×3, 2×4, 2×5, 2×6, 3×2, 3×3, 3×4, 3×5, 3×6, 4×2, 4×3, 4×4, 4×5, 4×6, 5×2, 5×3, 5×4, 5×5, 5×6, 6×2, 6×3, 6×4, 6×5, or 6×6 grid (rows×columns), or another arrangement. The format of grids within a set can be the same or different. The cells of a grid can be arranged in any desired two-dimensional arrangement, such as a square or rectangular grid containing 3, 4, 6, 8, 9, 10, 12, 15, 16, 20, 24, 25, 27, 30 or more cells, or a different arrangement. A set can include any number of grids, such as 2, 3, 4, 5, 6, 7, 8, 9, 10, 12, 15, 20, 30, 40, 50, or more grids.
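As an illustration of the grid formats described above, the sketch below builds a rows×columns grid in which a randomly chosen subset of cells receives the stimulus at graded intensities. The function name, the use of log spacing, and the numeric values are illustrative assumptions, not requirements of the specification.

```python
import math
import random


def make_grid(rows: int, cols: int, n_stimuli: int,
              s_min: float, s_max: float, rng: random.Random):
    """Return a rows x cols matrix; stimulus cells hold an intensity,
    blank cells hold None. Intensities are log-spaced over [s_min, s_max]
    so they run from hard-to-detect to easy-to-detect."""
    n_cells = rows * cols
    stim_cells = rng.sample(range(n_cells), n_stimuli)  # random placement
    step = (math.log(s_max) - math.log(s_min)) / max(n_stimuli - 1, 1)
    intensities = [math.exp(math.log(s_min) + i * step)
                   for i in range(n_stimuli)]
    grid = [[None] * cols for _ in range(rows)]
    for cell, s in zip(stim_cells, intensities):
        grid[cell // cols][cell % cols] = s
    return grid


grid = make_grid(4, 5, 6, 0.01, 0.5, random.Random(0))  # a 4x5 example grid
```

Leaving some cells blank provides the catch (stimulus-absent) trials that later support the signal detection analysis.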
The cells of a grid preferably will display either no stimulus or a single type of stimulus, wherein the intensity of the stimulus varies among the cells containing the stimulus. The set of grids can contain a different stimulus in each grid, or two or more grids of the set can contain the same stimulus presented identically or differently in cell arrangement or intensity range. A set of grids can contain any number of different stimuli, such as 1, 2, 3, 4, 5, 6, 7, 8, 9, or 10 or more different stimuli. A set of grids can also present one or more types of different stimuli over different ranges of intensity, the range presented in a single grid or spread out over two or more grids.
A stimulus can be a characteristic of a visual object that is perceived by the subject. For example, the perceived characteristic of a stimulus can be the absence, presence, luminance, contrast, color, depth, motion, flicker, spatial form, object recognition, object shape, object form, object size, facial recognition, facial feature recognition, feature position, feature angle, spatial resolution, noise-defined depth, or sparse-pattern depth of a visual object or pattern, and can optionally be supplemented by change over time or space or the addition of an auditory stimulus. Preferably the perceived characteristic is the same for all cells of a grid in which it appears, and the strength, intensity, or detectability by the subject varies within the grid. Preferably the range of stimulus strength, intensity, or detectability encompasses barely detectable characteristics as well as easily detectable characteristics. The position in a grid of cells containing or not containing a stimulus, or containing varying intensities of the stimulus, can be random, or can be selected according to a desired pattern. Stimuli or the range of their intensity also can be displayed adaptively, such that the subject's sensitivity is calculated on the fly and used to alter the stimulus or range of intensity in subsequent cells, grids, or sets of grids. Examples of grids with visual stimuli are shown in the accompanying figures.
Optionally, cells can be displayed individually, with other cells of the grid not displayed, so as to avoid distracting the subject or interactions between cells in the eyes or mind of the subject. Pattern perception and motion perception stimuli can be affected by the presence of multiple stimuli. Sensitivity to motion and pattern stimuli can be greater in the peripheral visual field than in central vision. This means that a target may be visible in a cell away from the current gaze direction and no longer be visible when the subject directly views the cell, which can be confusing. Additionally, sensitivity to the stimulus in a given cell may be affected by stimuli in adjacent cells for certain tasks. To avoid such effects, a hidden cell paradigm can be implemented in which only the cell beneath the mouse pointer is presented at any time.
In addition to selecting any cells where the subject detects a target stimulus, additional response parameters can be recorded and scored. For example, the subject's confidence in their response can be indicated by clicking in a different location in the cell. For example, an observer can indicate low confidence of their response by clicking to the left side of the cell, or high confidence in their response by clicking on the right side of the cell. The two dimensions of the cell (vertical and horizontal, or radial and rotational) can be used to score separate parameters (e.g., apparent age of a face on the horizontal axis and apparent gender on the vertical axis).
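The confidence-reporting scheme just described can be sketched as a simple mapping from the horizontal click position within a cell to a confidence label. The two-level split and all names below are assumptions for illustration; the specification also contemplates scoring two independent parameters from the two axes of the cell.

```python
def confidence_from_click(x: float, cell_left: float, cell_width: float) -> str:
    """Map a click's horizontal position within a cell to a confidence
    label: left half of the cell = low confidence, right half = high."""
    fraction = (x - cell_left) / cell_width
    if not 0.0 <= fraction <= 1.0:
        raise ValueError("click is outside the cell")
    return "low" if fraction < 0.5 else "high"


print(confidence_from_click(12.0, 10.0, 20.0))  # left half -> "low"
```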
After the subject has selected all cells that they think contain a stimulus, a computer algorithm can classify the response in each cell as a hit (stimulus present and reported present), a miss (stimulus present but reported absent), a false alarm (stimulus absent but reported present), or a correct rejection (stimulus absent and reported absent).
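As an illustration, each cell response can be classified with the standard signal detection categories (hit, miss, false alarm, correct rejection) and an empirical d′ computed from the resulting rates. The 1/(2N) correction that keeps rates away from 0 and 1 is a common convention assumed here, not taken from the text.

```python
from statistics import NormalDist


def classify(stimulus_present: bool, reported_present: bool) -> str:
    """Standard signal detection category for one cell response."""
    if stimulus_present:
        return "hit" if reported_present else "miss"
    return "false alarm" if reported_present else "correct rejection"


def empirical_d_prime(hits, misses, false_alarms, correct_rejections):
    """d' = z(hit rate) - z(false alarm rate), with a 1/(2N) correction
    so that extreme rates of 0 or 1 remain invertible."""
    n_signal = hits + misses
    n_noise = false_alarms + correct_rejections
    hit_rate = min(max(hits / n_signal, 1 / (2 * n_signal)),
                   1 - 1 / (2 * n_signal))
    fa_rate = min(max(false_alarms / n_noise, 1 / (2 * n_noise)),
                  1 - 1 / (2 * n_noise))
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)


print(classify(True, True))                      # hit
print(round(empirical_d_prime(9, 1, 1, 9), 2))   # 2.56
```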
Signal detection theory can be used to estimate sensitivity for each stimulus intensity. For example, the function called d-prime (d′), shown below, can be used. The data are fit with a psychometric function to generate an updated estimate of d′ for the current observer and task:
where τ is the sensitivity threshold (i.e., the stimulus intensity where d′=1), β is the upper asymptote of the saturating function (i.e., the stimulus intensity where d′=5), s is signal intensity, and γ is the slope of the function.
The d-prime function can be related to the psychometric function of the probability of reporting the presence of a stimulus as a function of its intensity (illustrated in the accompanying figures):
Ψyes(s) = 1 − G(z(1 − Ψyes(0)) − d′(s))
where G(s) is a cumulative Gaussian function, and Ψyes(0) is the false alarm rate.
The psychometric function can be computed on-the-fly for each matrix, and can be used to estimate a stimulus for which d-prime=0.1, which is extremely difficult to detect, and a stimulus intensity for which d-prime=4.5, which is very easy to detect.
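These on-the-fly estimates can be obtained by numerically inverting any fitted, monotonically increasing d′ function. The sketch below uses bisection; since the fitted form itself is not reproduced here, the particular saturating d_prime shown (a Naka-Rushton-style function with asymptote 5) is an assumption for illustration only.

```python
def invert_monotone(f, target, lo, hi, tol=1e-9):
    """Find s in [lo, hi] with f(s) == target by bisection
    (f must be monotonically increasing on [lo, hi])."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)


def d_prime(s, theta=0.2, gamma=2.0):
    """Assumed illustrative d' function: saturating, asymptote 5."""
    return 5.0 * s**gamma / (s**gamma + theta**gamma)


s_hard = invert_monotone(d_prime, 0.1, 0.0, 10.0)  # intensity where d' = 0.1
s_easy = invert_monotone(d_prime, 4.5, 0.0, 10.0)  # intensity where d' = 4.5
```

The two recovered intensities bracket the useful testing range for the current observer: stimuli near s_hard are nearly invisible, and stimuli near s_easy are reliably detected.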
An alternative sensitivity function is an Orientation Error function, defined as:
where τ is a sensitivity threshold, θi is internal orientation uncertainty, s is signal intensity and γ is the slope of the function.
Other functions include a cumulative Gaussian function, defined as:
where τ is a sensitivity threshold, pguess is the probability of a correct response for a guess (equal to the reciprocal of the number of alternative response choices), s is signal intensity, and γ is the slope of the function.
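A minimal sketch of a cumulative Gaussian psychometric function of this kind, for an n-alternative forced-choice task, follows. Only τ, pguess, and the cumulative Gaussian come from the text; taking the Gaussian's standard deviation as the reciprocal of the slope parameter is an assumed parameterization.

```python
from statistics import NormalDist


def psi_nafc(s, tau, slope, n_alternatives):
    """P(correct) rises from the guess rate 1/n toward 1 as the stimulus
    intensity s passes the threshold tau; slope sets the steepness."""
    p_guess = 1.0 / n_alternatives
    g = NormalDist(mu=tau, sigma=1.0 / slope).cdf(s)  # cumulative Gaussian
    return p_guess + (1.0 - p_guess) * g


# At s = tau the function sits halfway between guessing and perfect.
print(psi_nafc(0.5, tau=0.5, slope=4.0, n_alternatives=4))  # 0.625
```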
The stimuli in each matrix can be chosen to span the range from difficult-to-detect to easy-to-detect. The number of signal stimuli preferably is random on each trial, and their position in the matrix preferably is random on each trial. The stimulus intensities on the first trial can be based on data from previous observers (e.g., typical sensitivity for comparable subjects) or can be based on physical stimulus limits (e.g., the color gamut of a display). The stimulus intensities on subsequent grids can be based on the estimate of d′ or another psychometric function computed for all previous grids for the current subject, which optimizes the test for each subject. At the end of the test, visual sensitivity can be computed from all responses to all cells.
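The adaptive scheme above can be sketched as follows: on the first grid the intensities come from defaults (prior observers or display limits), and on later grids they span the current estimates of the very-hard and very-easy intensities. The function name, the default range, and the log spacing are illustrative assumptions.

```python
import math


def next_intensities(n, s_hard=None, s_easy=None,
                     default_range=(0.01, 1.0)):
    """Choose n log-spaced stimulus intensities. With no estimates yet
    (the first grid), fall back to the default range; afterwards span
    the current estimates of the hard and easy intensities."""
    lo, hi = (s_hard, s_easy) if s_hard and s_easy else default_range
    step = (math.log(hi) - math.log(lo)) / max(n - 1, 1)
    return [math.exp(math.log(lo) + i * step) for i in range(n)]


first = next_intensities(5)                           # defaults, first grid
later = next_intensities(5, s_hard=0.03, s_easy=0.6)  # from fitted estimates
```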
An implementation of the present technology can utilize the Psychtoolbox open source software (see psychtoolbox.org/credits). Other software also can be used.
Any of the tests, algorithms, systems, or devices described in WO2013170091 A1, which is hereby incorporated by reference in its entirety, can be used in the present technology.
Psychometric formulas can be used to determine the threshold of stimulus characteristics, such as contrast or color, using a single stimulus dimension, such as spatial frequency for contrast, or hue for color. In many cases, functional forms for such thresholds are known. For example, a log parabola describes contrast sensitivity as a function of spatial frequency.
Summary data outputs from subject testing can include the area under the log contrast sensitivity function, the area under the threshold-versus-contrast function, the area under the chromatic sensitivity function, and the volume under the spatio-temporal contrast sensitivity function when the functional form of the relationship between stimuli and sensitivity is known.
As an example, visual contrast sensitivity varies in two dimensions as a function of spatial frequency and contrast, according to the following relationship:
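The contrast sensitivity relationship referred to here is commonly modeled as a log parabola. The sketch below assumes the widely used parameterization with peak sensitivity s_max at peak frequency f_max and a bandwidth parameter (all values illustrative), and estimates the area under the log contrast sensitivity function, one of the summary outputs mentioned above, by the trapezoidal rule.

```python
import math


def log_csf(f, s_max=200.0, f_max=3.0, bandwidth=1.5):
    """Log-parabola contrast sensitivity: a parabola in log10 sensitivity
    versus log10 spatial frequency, peaking at (f_max, s_max)."""
    return math.log10(s_max) - ((math.log10(f) - math.log10(f_max))
                                / bandwidth) ** 2


def aulcsf(f_lo=0.5, f_hi=30.0, n=200):
    """Area under the log CSF over log10 frequency (trapezoidal rule),
    clipping negative log sensitivities to zero."""
    xs = [math.log10(f_lo) + i * (math.log10(f_hi) - math.log10(f_lo)) / n
          for i in range(n + 1)]
    ys = [max(log_csf(10 ** x), 0.0) for x in xs]
    h = xs[1] - xs[0]
    return h * (0.5 * ys[0] + sum(ys[1:-1]) + 0.5 * ys[-1])
```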
As another example, visual contrast sensitivity varies in three dimensions as a function of spatial frequency, temporal frequency, and contrast, according to the following relationship:
(D. Kelly, Motion and vision. II. Stabilized spatio-temporal threshold surface, JOSA, 69, pp. 1340-1349 (1979)).
As yet another example, visual color sensitivity varies in two dimensions as a function of color saturation (k) and hue angle (h), according to an ellipse centered on the reference point (h, k):
(W. R. J. Brown and D. L. MacAdam, "Visual Sensitivities to Combined Chromaticity and Luminance Differences," J. Opt. Soc. Am. 39, 808-834 (1949)).
The above-described methods can be used to assess visual acuity.
The present method can be extended to estimate suprathreshold discrimination performance, as well as threshold performance.
Some neurological disorders selectively affect face processing. For example, the identification of individuals is impaired in prosopagnosia, and the recognition of emotional affect can be impaired in people with autism spectrum disorder. The technology can be extended to include target and non-target social cognition signals in order to detect the presence, progression, or remediation of social cognition impairment.
The present technology has the potential to address limitations of the prior art in the following ways. First, the test is very quick, so one or more tests can be administered conveniently in a single clinic visit or screening. Second, the same paradigm can be employed using a plurality of different stimulus types, so that a comprehensive assessment of the function of different brain areas can be completed quickly and efficiently. Third, the same easy-to-understand protocol can be employed for all tests, so human subjects of all abilities can complete the test and do not need to learn a new protocol for different tests. Fourth, the test includes easy and difficult stimuli simultaneously, so subjects do not have to remember the test signal for the current task. Fifth, the test can be self-administered, so human subjects can complete tests at home, at work, when travelling, or on the sideline of a sporting event, without traveling to a clinic.
The present technology can be used to detect and/or monitor the progression of a range of ophthalmic diseases, including age-related macular degeneration, glaucoma, and amblyopia. Neurologic diseases or conditions, such as concussion, also can be diagnosed and/or monitored using the present technology. The technology can make a significant contribution to drug development for ophthalmic and neurologic diseases. The tests provided herein also can be used as an endpoint for optometric correction, such as correction involving contact lenses, spectacles, or intraocular lenses. The tests also can be used as an endpoint for treatment of neuro-ophthalmic disorders, including traumatic brain injury, head trauma, autism spectrum disorder, and attention deficit disorders. Table 1 summarizes how the technology can specifically diagnose a variety of visual and neurological conditions.
The present technology includes the following advantageous features:
i) Testing is extremely quick, at least 10 times faster than comparable tests. A single comprehensive test can be performed in about 30 seconds, compared with 18 minutes for alternative methods.
ii) The testing method can be generalized to a broad range of tests for different visual pathways. Other tests require subjects to learn new stimuli and tasks for each test. Because the same method is employed in a broad range of tests, a more comprehensive assessment of the patient can be carried out.
iii) The test is intuitive and easy to administer. Other tests (e.g., letter acuity charts) require subjects to learn specific test items (e.g., the Western alphabet), which complicates testing of young, non-Western, or cognitively impaired subjects.
iv) The test is adaptive. Each grid can be updated based on responses to successive stimuli, and each grid can contain both challenging and easy stimuli, which ensures the presence of exemplar stimuli and reduces memory demand for the task.
v) The test intentionally includes catch and null stimuli, which prevents cheating and eliminates the frustration of guessing that is encountered in other tests.
vi) The test can be self-administered and does not require a clinician or technician to proctor the test.
vii) The test can be administered away from a clinic, such as in the home, at a sports arena or battlefield, or for ecological momentary assessment. The ease and rapidity of administration can help identify visual pathway deficits earlier than alternative methods and can lead to earlier intervention and improved treatment outcomes.
The methods described herein can be implemented in any suitable computing system. The computing system can be implemented as or can include a computer device that includes a combination of hardware, software, and firmware that allows the computing device to run an applications layer or otherwise perform various processing tasks. Computing devices can include without limitation personal computers, work stations, servers, laptop computers, tablet computers, mobile devices, wireless devices, smartphones, wearable devices, embedded devices, microprocessor-based devices, microcontroller-based devices, programmable consumer electronics, mini-computers, main frame computers, and the like and combinations thereof.
Processing tasks can be carried out by one or more processors. Various types of processing technology can be used including a single processor or multiple processors, a central processing unit (CPU), multicore processors, parallel processors, or distributed processors. Additional specialized processing resources such as graphics (e.g., a graphics processing unit or GPU), video, multimedia, or mathematical processing capabilities can be provided to perform certain processing tasks. Processing tasks can be implemented with computer-executable instructions, such as application programs or other program modules, executed by the computing device. Application programs and program modules can include routines, subroutines, programs, scripts, drivers, objects, components, data structures, and the like that perform particular tasks or operate on data.
Processors can include one or more logic devices, such as small-scale integrated circuits, programmable logic arrays, programmable logic devices, mask-programmed gate arrays, field programmable gate arrays (FPGAs), application specific integrated circuits (ASICs), and complex programmable logic devices (CPLDs). Logic devices can include, without limitation, arithmetic logic blocks and operators, registers, finite state machines, multiplexers, accumulators, comparators, counters, look-up tables, gates, latches, flip-flops, input and output ports, carry-in and carry-out ports, parity generators, and interconnection resources for logic blocks, logic units, and logic cells.
The computing device includes memory or storage, which can be accessed by a system bus or in any other manner. Memory can store control logic, instructions, and/or data. Memory can include transitory memory, such as cache memory, random access memory (RAM), static random access memory (SRAM), main memory, dynamic random access memory (DRAM), block random access memory (BRAM), and memristor memory cells. Memory can include storage for firmware or microcode, such as programmable read only memory (PROM) and erasable programmable read only memory (EPROM). Memory can include non-transitory or nonvolatile or persistent memory such as read only memory (ROM), one time programmable non-volatile memory (OTPNVM), hard disk drives, optical storage devices, compact disc drives, flash drives, floppy disk drives, magnetic tape drives, memory chips, and memristor memory cells. Non-transitory memory can be provided on a removable storage device. A computer-readable medium can include any physical medium that is capable of encoding instructions and/or storing data that can be subsequently used by a processor to implement embodiments of the systems and methods described herein. Physical media can include floppy discs, optical discs, CDs, mini-CDs, DVDs, HD-DVDs, Blu-ray discs, hard drives, tape drives, flash memory, or memory chips. Any other type of tangible, non-transitory storage that can provide instructions and/or data to a processor can be used in the systems and methods described herein.
The computing device can include one or more input/output interfaces for connecting input and output devices to various other components of the computing device. Input and output devices can include, without limitation, keyboards, mice, joysticks, microphones, cameras, webcams, displays, touchscreens, monitors, scanners, speakers, and printers. Interfaces can include universal serial bus (USB) ports, serial ports, parallel ports, game ports, and the like.
The computing device can access a network over a network connection that provides the computing device with telecommunications capabilities. The network connection enables the computing device to communicate and interact with any combination of remote devices, remote networks, and remote entities via a communications link. The communications link can be any type of communication link, including without limitation a wired or wireless link. For example, the network connection can allow the computing device to communicate with remote devices over a network, which can be a wired and/or a wireless network, and which can include any combination of intranet, local area networks (LANs), enterprise-wide networks, medium area networks, wide area networks (WANs), virtual private networks (VPNs), the Internet, cellular networks, and the like. Control logic and/or data can be transmitted to and from the computing device via the network connection. The network connection can include a modem, a network interface (such as an Ethernet card), a communication port, a PCMCIA slot and card, or the like to enable transmission and receipt of data via the communications link. A transceiver can include one or more devices that both transmit and receive signals, whether sharing common circuitry, a housing, or a circuit board, or whether distributed over separate circuitry, housings, or circuit boards, and can include a transmitter-receiver.
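As one illustrative sketch of transmitting data to and from the computing device via the network connection, screening results could be serialized and packaged for transmission over the communications link. The endpoint URL and field names below are hypothetical, and the example uses only the Python standard library:

```python
import json
import urllib.request

def build_upload_request(results, url="https://example.com/api/results"):
    """Package a dict of screening results as a JSON POST request.
    The URL is a placeholder; a real deployment would supply its own
    endpoint and authentication."""
    payload = json.dumps(results).encode("utf-8")
    return urllib.request.Request(
        url,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
```

The request object would then be sent over the communications link (e.g., with `urllib.request.urlopen`) once a server endpoint is available.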
The computing device can include a browser and a display that allow a user to browse and view pages or other content served by a web server over the communications link. A web server, server, and database can be located at the same or at different locations and can be part of the same computing device, different computing devices, or distributed across a network. A data center can be located at a remote location and accessed by the computing device over a network. The computer system can include architecture distributed over one or more networks, such as, for example, a cloud computing architecture. Cloud computing includes without limitation distributed network architectures for providing, for example, software as a service (SaaS).
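A minimal sketch of the web-server arrangement described above, using only the Python standard library, is given below. The page content and handler names are illustrative assumptions; a real deployment would serve the interactive stimulus presentation described herein rather than a static page:

```python
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer
import threading
import urllib.request

# Placeholder page; a real system would serve the interactive test.
PAGE = b"<html><body><p>Acuity test: identify the letter E</p></body></html>"

class TestPageHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve the same static test page for every path in this sketch.
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(PAGE)

    def log_message(self, fmt, *args):
        pass  # suppress per-request logging in this sketch

def serve_once():
    """Start the server on an ephemeral port, fetch the page once over
    the local loopback link, then shut the server down."""
    server = ThreadingHTTPServer(("127.0.0.1", 0), TestPageHandler)
    thread = threading.Thread(target=server.serve_forever, daemon=True)
    thread.start()
    url = f"http://127.0.0.1:{server.server_port}/test"
    body = urllib.request.urlopen(url).read()
    server.shutdown()
    return body
```

The browser on the subject's computing device would take the place of the loopback fetch, retrieving test pages over the communications link.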
As used herein, “consisting essentially of” allows the inclusion of materials or steps that do not materially affect the basic and novel characteristics of the claim. Any recitation herein of the term “comprising”, particularly in a description of components of a composition or in a description of elements of a device, can be exchanged with the alternative expressions “consisting essentially of” or “consisting of”.
This invention was made with government support under Grant Number EY029713 awarded by the National Institutes of Health. The government has certain rights in the invention.
| Filing Document | Filing Date | Country | Kind |
|---|---|---|---|
| PCT/US2021/049250 | 9/7/2021 | WO | |
| Number | Date | Country |
|---|---|---|
| 63075084 | Sep 2020 | US |