The subject matter of this disclosure relates to ophthalmic instruments for measuring properties of an eye of a user.
There are several ophthalmic devices that measure properties of an eye of a user, such as intraocular pressure, the condition of the retina, the topography of the cornea, etc. These devices may require alignment between the device and the user's eye. Traditionally, the alignment has been achieved based on input from a human operator, such as a physician, a physician's assistant, or a nurse, who views the user's eye while the user's head is resting on a chin/head rest and who manually moves the device into its proper position.
It would be desirable to have an ophthalmic device that can be positioned against an eye of a user while the device makes measurements of the eye, without requiring any assistance from an operator. This would reduce costs and enable the measurements to be taken outside of a physician's office. An aspect of the disclosure here is an ophthalmic device that presents a visual alignment target at which the user is instructed to look. The user may be instructed to bring their eye into proximity of the device and then look for the target through a view port of the device. The target may be either a static image or a dynamic image that is presented by a display, e.g., a microdisplay, within the device. An alignment mechanism within the device automatically aligns an eye property measurement sensor in relation to the eye, to ensure accuracy of the measurements. The alignment is automatic in that it does not require input from an operator.
The alignment may require that the user can see the target with sufficient acuity. However, many eye measurements do not allow the user to wear spectacles or contact lenses during the measurement, and the wide distribution of myopia and hyperopia in the population makes it difficult for all users to see the target at a high enough resolution (which is needed to ensure comfortable and timely alignment). In addition, a user's accommodation range (the range of distances between the eye and the target for comfortable viewing) changes as the user ages, which further reduces the portion of the population that could see the target sufficiently well.
In accordance with one aspect of the disclosure here, a device for measuring eye properties has a device housing (e.g., that of a tabletop instrument or a handheld one) in which there is a sensor subsystem (sensor) that measures a property of the user's eye. An electronically controlled alignment mechanism, to which the sensor is coupled, serves to align the sensor to the eye of the user. Also in the device housing are a display that shows the target to the user's eye, and optics positioned in an optical path taken by the target (as it is being shown by the display) from the display to the eye of the user. The optics is configured to change accommodation by the eye, and it has an adjustable focal length that enables the user to see the target with changing acuity. The alignment process performed by the alignment mechanism is triggered (to start or resume) in response to a processor obtaining an indication that the eye of the user is focused on the target (or that the user can see the target with sufficient acuity). In this manner, the alignment which prepares the sensor to make a measurement of the eye property is likely to be faster and more accurate, which makes the eye examination process more efficient.
The above summary does not include an exhaustive list of all aspects of the present disclosure. It is contemplated that the disclosure includes all systems and methods that can be practiced from all suitable combinations of the various aspects summarized above, as well as those disclosed in the Detailed Description below and particularly pointed out in the Claims section. Such combinations may have advantages not specifically recited in the above summary.
Several aspects of the disclosure here are illustrated by way of example and not by way of limitation in the figures of the accompanying drawings, in which similar references indicate similar elements. It should be noted that references to “an” or “one” aspect in this disclosure are not necessarily to the same aspect, and they mean at least one. Also, in the interest of conciseness and reducing the total number of figures, a given figure may be used to illustrate the features of more than one aspect of the disclosure, and not all elements in the figure may be required for a given aspect.
Several aspects of the disclosure with reference to the appended drawings are now explained. Whenever the shapes, relative positions and other aspects of the parts described are not explicitly defined, the scope of the invention is not limited only to the parts shown, which are meant merely for the purpose of illustration. Also, while numerous details are set forth, it is understood that some aspects of the disclosure may be practiced without these details. In other instances, well-known circuits, structures, and techniques have not been shown in detail so as not to obscure the understanding of this description.
Referring now to
The sensor 3 may need to be aligned with the eye in order to produce the measurement of the eye property. For that reason, the sensor 3 is coupled to an alignment mechanism 4 (that is also in the device housing). The alignment mechanism 4 may include an actuator that is electronically controlled by a processor 6 and that serves to move the sensor 3, or in other words actuate any moveable component of the sensor 3 (e.g., an emitter, a detector, or an optical component of the sensor 3), on command by the processor 6.
The processor 6 is configured or programmed (e.g., when it is executing instructions stored in memory, not shown) to signal the alignment mechanism 4 to start or resume either an open loop or a closed loop process, to align the sensor 3 to the eye of the user. In one aspect of the disclosure here, the processor 6 may only do so in response to obtaining some indication that the eye of the user is focused on the target that is being shown by a display 7 in the device housing, or in other words that the user can see the target with sufficient acuity. This helps ensure that the alignment process correctly and quickly positions the sensor 3 for measuring the eye property. The target may be a graphical object, or an image of a real object. The display 7 may be a microdisplay, e.g., a miniature display with a diagonal display size of less than 2 inches. The target may be either a static image or an active image shown by the display 7. Status information may also be presented by the display 7, e.g., a countdown clock, which eye is being measured, etc. Optics 8 in the device housing is positioned in an optical path that is taken by the target (as the target is being shown by the display 7). The optical path is from the display 7 to the eye of the user as depicted in the figures.
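The gating described above (alignment runs only after the focus indication is obtained) can be sketched as a small state machine. This is purely illustrative and not part of the disclosure; the `controller` object and its method names are assumptions standing in for the processor 6, alignment mechanism 4, and sensor 3.

```python
from enum import Enum, auto

class State(Enum):
    WAIT_FOR_FOCUS = auto()   # waiting for indication that user sees target
    ALIGNING = auto()         # closed-loop alignment of the sensor
    MEASURING = auto()        # sensor is positioned; take the measurement

def run(controller):
    """Illustrative control loop: the alignment process starts only in
    response to an indication that the eye is focused on the target.
    `controller` is a hypothetical object exposing the checks and
    actuations described in the text."""
    state = State.WAIT_FOR_FOCUS
    while state is not State.MEASURING:
        if state is State.WAIT_FOR_FOCUS:
            # Indication may come from a button press, detected speech,
            # or eye tracking data (see the aspects described herein).
            if controller.user_sees_target_clearly():
                state = State.ALIGNING
        elif state is State.ALIGNING:
            error = controller.alignment_error()   # closed-loop case
            if error < controller.tolerance:
                state = State.MEASURING
            else:
                controller.move_sensor(error)      # actuate the sensor 3
    controller.measure_eye_property()
```

An open-loop variant would replace the `alignment_error` feedback with a precomputed actuator trajectory; the gating on the focus indication is the same in both cases.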
The optics 8 is configured to change accommodation by the eye and has an adjustable focal length that enables the user to see the target with changing acuity. Changing accommodation lets the user see the target more easily, particularly in cases where the display 7 is positioned no more than two hundred millimeters from the eye (when the device housing is in proximity to the eye). In one aspect, the optics 8 includes a convex solid lens that is motorized to be moveable as controlled electronically by the processor 6. That is a general description, of course, in that it covers cases where the optics 8 includes a series of two or more lenses (e.g., convex and concave) in which the axial position of one or more of the lenses can be adjusted and controlled electronically by the processor 6. The axial position could be adjusted by making the lens moveable in the axial direction, or equivalently by making the display 7 moveable in the axial direction. In another aspect, the optics 8 includes a fluidic lens whose shape is controlled electronically by the processor 6 (to change the focal length).
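As a rough illustration of why an adjustable focal length widens the population that can see the target, consider a thin-lens sketch (not part of the disclosure; the function name, the use of diopters, and the sign convention of negative refractive error for myopia are all assumptions). The lens power is chosen so that the virtual image of the nearby display lands at the user's far point, i.e., at infinity for an emmetrope.

```python
def lens_power_diopters(display_distance_m: float,
                        refractive_error_d: float = 0.0) -> float:
    """Thin-lens sketch: power (in diopters) needed so that the target's
    virtual image lands at the user's far point.

    display_distance_m: distance from the lens (the optics 8) to the
        display 7, in meters.
    refractive_error_d: user's spherical refractive error in diopters
        (negative for myopia, positive for hyperopia, 0 for emmetropia).
    """
    # 1/f = 1/d collimates light from a display at distance d; adding the
    # user's refractive error shifts the virtual image to their far point.
    return 1.0 / display_distance_m + refractive_error_d
```

For a display 50 mm from the lens, an emmetrope needs 20 D (f = 50 mm), which collimates the light so the relaxed eye sees the target; a −2 D myope instead needs 18 D, which places the virtual image 0.5 m in front of the eye, at that user's far point.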
In one aspect, the processor 6 is configured to obtain the indication that the eye of the user is focused on the target, by prompting the user to indicate when the user can see the target clearly, while the processor 6 is signaling the optics 8 to change focal length. This prompting may be performed by the processor 6 signaling an audio subsystem (not shown) to instruct the user, “Please press the button or respond verbally, when you can see the target clearly.” The audio subsystem may have a microphone in the device housing, and the processor 6 processes an audio signal output by the microphone to detect audible input from the user as the indication that the user can see the target clearly.
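The prompt-and-sweep interaction above can be sketched as follows. This is a minimal illustration, not the disclosed implementation; the `optics`/`user_input` objects, the diopter range, the step size, and the dwell time are all assumptions.

```python
import time

def sweep_until_clear(optics, user_input,
                      start_d=-6.0, stop_d=6.0, step_d=0.25,
                      dwell_s=0.5):
    """Illustrative focal-length sweep: step the adjustable optics
    through a range of powers and stop at the setting where the user
    signals (by button press or detected speech) that the target is
    clear.  Returns the chosen power, or None if never confirmed."""
    power = start_d
    while power <= stop_d:
        optics.set_power(power)   # motorized lens or fluidic lens
        time.sleep(dwell_s)       # give the user time to judge clarity
        if user_input.poll():     # focus indication obtained
            return power
        power += step_d
    return None
```

A practical variant might run a coarse sweep followed by a fine sweep around the confirmed setting, but the essential point is the same: the user's response is the signal that ends the sweep.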
As an alternative, the focal length of the optics 8 may be manually adjustable, e.g., by the user turning a knob in the device housing. The processor 6 in that case may also be configured to obtain the indication that the eye of the user is focused on the target, by receiving manual (e.g., a button press) or audible input from the user indicating that the user can see the target clearly. For instance, the user may be instructed to manually adjust the optics 8 using their fingers until they can see the target clearly, at which point the user presses a button or speaks a phrase which the processor 6 interprets as indicating that the user can see the target clearly.
In another aspect, the device 2 contains an eye tracking subsystem in the device housing. The processor 6 in that case is configured to process eye tracking data, produced by the eye tracking subsystem, to determine whether the eye is looking at the target that is being shown on the display 7, and, in response to determining that the eye is looking at the target, the processor 6 signals the alignment mechanism 4 to perform the alignment process.
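One simple way such a determination could be made from eye tracking data is an angular gate between the measured gaze direction and the known direction from the eye to the target on the display. This sketch is illustrative only; the vector representation and the 2° tolerance are assumptions, not details from the disclosure.

```python
import math

def is_looking_at_target(gaze_vec, target_vec, max_angle_deg=2.0):
    """Illustrative gate on eye tracking data: report that the eye is
    'looking at the target' when the angle between the measured gaze
    direction and the eye-to-target direction is within a tolerance.
    Both inputs are 3-D direction vectors (need not be unit length)."""
    dot = sum(g * t for g, t in zip(gaze_vec, target_vec))
    norm = math.hypot(*gaze_vec) * math.hypot(*target_vec)
    cos_angle = max(-1.0, min(1.0, dot / norm))  # clamp for acos safety
    return math.degrees(math.acos(cos_angle)) <= max_angle_deg
```

In a device along the lines described, the processor 6 might require this condition to hold for some dwell time before signaling the alignment mechanism 4, to avoid triggering on a brief glance.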
In another aspect, referring now to
Referring now to
In one aspect of the method in
As an alternative to the processor 6 signaling a motorized actuator or fluidic lens of the optics 8 to change focal length, the optics 8 may have a manually adjustable focal length, e.g., a focus knob that is adjustable by the user's fingers. The user's indication that the target has come into focus may also be a manual input, e.g., a button press by the user.
In another aspect of the method of
While certain aspects have been described and shown in the accompanying drawings, it is to be understood that such aspects are merely illustrative of and not restrictive on the invention, and that the invention is not limited to the specific constructions and arrangements shown and described, since various other modifications may occur to those of ordinary skill in the art. For example, although the processor 6 may be integrated within the device housing (along with the sensor 3, the display 7, the alignment mechanism 4 and the optics 8), in some instances part of the functionality or operations performed by the processor 6 can be performed by another processor that is in wired or wireless communication with the processor that is in the device housing. The other processor may be that of a laptop computer, a tablet computer, a smartphone, or even a website server. The description is thus to be regarded as illustrative instead of limiting.
This nonprovisional patent application claims the benefit of the earlier filing date of U.S. Provisional Application No. 63/376,187, filed Sep. 19, 2022.