The present invention relates generally to control apparatus.
In illustrative implementations of this invention, a human user mechanically moves one or more moveable parts in a handheld controller, and thereby optically controls a mobile computing device (MCD). In illustrative implementations, the optical control is implemented as follows: A camera onboard the MCD captures images. The images show the motion of the moveable parts in the handheld controller. A computer onboard the MCD analyzes these images to detect the motion, maps the motion to a control signal, and outputs the control signal to control a feature or operation of the MCD.
In some implementations of this invention, the mobile computing device (MCD) comprises a smartphone, cell phone, mobile phone, laptop computer, tablet computer, or notebook computer.
The handheld controller includes one or more moveable parts that undergo mechanical movement, relative to the controller as a whole. For example, some of the moveable parts comprise I/O devices (e.g., buttons, dials and sliders) that a human user touches. Other moveable parts comprise parts that are not directly touched by a human user, but undergo mechanical motion, relative to the controller as a whole, that is actuated (e.g., through gears, linkages, or other motion transmission elements) by movement of the I/O devices.
For example, in some cases, a human user rotates a dial on the handheld controller. In some cases, the dial is the pinion in a rack and pinion, such that rotation of the dial actuates linear motion of a rack inside the handheld controller.
A camera in the MCD captures visual data regarding all or part of these moveable parts, while (optionally) one or more light sources in the MCD illuminate the controller. A computer in the MCD analyzes this visual data to compute the position or motion of the moveable parts. Based on the computed position or motion of the moveable parts, the computer outputs control signals to control operation of the MCD. For example, in some cases, the control signals control light patterns that are displayed by the MCD.
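By way of illustration only, the following Python sketch (using the OpenCV and NumPy libraries) shows one plausible form of such an optical control loop, in which the centroid of a bright moveable part is located in each camera frame and mapped to a normalized control value; the camera index, threshold, and mapping are illustrative assumptions, not features of any particular embodiment.

```python
# Minimal sketch of an optical control loop: capture a frame, locate a bright
# moveable part by thresholding, compute its centroid, and map the centroid's
# horizontal position to a control value in [0, 1].
# Assumes: camera index 0 is the MCD's camera and the moveable part is the
# brightest region in view (illustrative assumptions only).
import cv2
import numpy as np

def control_value_from_frame(frame, threshold=200):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY)
    m = cv2.moments(binary, binaryImage=True)
    if m["m00"] == 0:
        return None                      # no feature detected in this frame
    cx = m["m10"] / m["m00"]             # centroid x, in pixels
    return cx / frame.shape[1]           # normalize to [0, 1]

cap = cv2.VideoCapture(0)
for _ in range(300):                     # a finite run, for illustration
    ok, frame = cap.read()
    if not ok:
        break
    value = control_value_from_frame(frame)
    if value is not None:
        # In a real system this value would drive an MCD feature
        # (e.g., the brightness or position of a displayed graphic).
        print(f"control value: {value:.3f}")
cap.release()
```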
In many implementations of this invention: (1) the handheld controller does not include any electronics, motor, engine or other artificial actuator; and (2) the handheld controller does not have a wired electrical connection to the MCD. As a result, in many implementations, the handheld controller is very inexpensive to manufacture. For example, in some cases the handheld controller comprises plastic, with no electronics.
Advantageously, the handheld controller allows a human user to input complex commands to an MCD by simple mechanical motions. This is particularly helpful at times when all or a portion of the MCD's display screen is being used for another function (such as testing for optical aberrations of a human eye or for cataracts) and is not available as a graphical user interface.
In some implementations, the controller is used to optically control an MCD, while the controller and MCD are attached to each other, and a screen onboard the MCD outputs images that are viewed by the human user as part of an eye test (e.g., a test for refractive aberrations of the user's eyes).
The description of the present invention in the Summary and Abstract sections hereof is just a summary. It is intended only to give a general introduction to some illustrative implementations of this invention. It does not describe all of the details and variations of this invention. Likewise, the description of this invention in the Field of Technology section is not limiting; instead it identifies, in a general, non-exclusive manner, a field of technology to which exemplary implementations of this invention generally relate. Likewise, the Title of this document does not limit the invention in any way; instead the Title is merely a general, non-exclusive way of referring to this invention. This invention may be implemented in many other ways.
The above Figures show some illustrative implementations of this invention. However, this invention may be implemented in many other ways. The above Figures do not show all of the details of this invention.
In illustrative implementations, a handheld controller is used to control operations of a mobile computing device to which the handheld controller is releasably attached. The handheld controller includes a set of mechanical user interfaces, such as buttons, scroll wheels, knobs, ratchets, sliders and other mechanical components. The handheld controller also includes a set of visual features that are either on, or part of, moveable parts. These moveable parts are either the mechanical user interfaces or components that are mechanically actuated by movement of the mechanical user interfaces. A camera in a mobile computing device is used to detect position or motion of the visual features. Based on this position or motion data, a computer onboard the MCD outputs signals to control operation of the MCD, including the graphics of the device display.
In the examples shown in
In the example shown in
The user holds the controller in one or both hands during operation. In some use scenarios, the user holds the controller with one hand, and uses the other hand to manipulate user interfaces. In some use scenarios, the user holds the controller securely with both hands while simultaneously manipulating user interfaces with both hands. In some use scenarios, the user holds the controller in one hand, while manipulating interfaces with the same hand. In some use scenarios, the controller is held by one person and controlled simultaneously by a second person (e.g., the second person manipulates the mechanical user interfaces of the controller).
In some cases, MCD 204 comprises a cellular phone (e.g. a smart phone). The MCD 204 includes a built-in camera or light sensor 208.
The handheld controller 202 includes a housing 219. The handheld controller 202 also includes mechanical user interfaces that the user manipulates. For example, in some cases, the user interfaces include turn dials 215, sliders 216, wheels 217, or buttons 218.
The handheld controller also includes an attachment mechanism that (a) easily attaches an MCD to the handheld controller, and (b) easily releases the MCD from the handheld controller. Over the course of the handheld controller's useful life, the handheld controller is repeatedly attached to, and then detached from, an MCD. During times when the MCD is attached to the handheld controller via the attachment mechanism, the position of the handheld controller relative to the MCD is fixed. The handheld controller includes a window 206 through which a user views a display screen 209 of the MCD, when the controller 202 and MCD 204 are attached to each other.
In the exploded views of
In
In
Thus, in
Depending on the particular implementation, a variety of different attachment mechanisms are used to releasably join the controller 202 and MCD 204 together. For example, in some cases, an attachment mechanism that is part of the handheld controller 202 comprises: (1) a clip that clips over the MCD; (2) one or more flexible bands or tabs that press against the MCD; (3) retention features that restrain the MCD on at least two edges or corners of the MCD (including retention features that are part of an opening in the controller); (4) a slot, opening or other indentation into which the MCD is wholly or partially inserted; (5) a socket into which the MCD is partially or wholly inserted into the controller; (6) a door or flap that is opened and closed via a hinge, which door or flap covers a socket or indentation into which the MCD is inserted; (7) a mechanism that restrains motion of the MCD, relative to the controller, in one or more directions but not in other directions; (8) a mechanism (e.g., a “snap-fit”) that snaps or bends into a position that tends to restrain motion of the MCD relative to the controller; or (9) one or more components that press against MCD and thereby increase friction and tend to restrain motion of the MCD relative to the controller.
A human user employs the mechanical interfaces of the controller to optically control one or more features or functions of the MCD, including to trigger device events, to launch or control applications that run on the MCD, or to display animated graphics on the MCD's display screen. A computer onboard the MCD recognizes mechanical interfaces that are in the handheld controller and in the camera's field of view, links a change in interface position to an applied user action, and generates control commands. For example, in some cases, a wheel is rotated over a given time period, the camera detects the relative or absolute displacement of the wheel (e.g., an angular change), and a computer then generates a command that is subsequently executed.
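By way of illustration only, the following sketch shows one plausible way to compute such an angular displacement from the pixel coordinates of a rim marker and the wheel's center, and to map the result to a command; the marker coordinates, step size, and "scroll" mapping are illustrative assumptions.

```python
# Sketch: compute the angular displacement of a wheel between two frames from
# the pixel coordinates of a rim marker, given the wheel's center. The mapping
# of the angle to a "scroll" command is an illustrative assumption.
import math

def wheel_angle(marker_xy, center_xy):
    dx = marker_xy[0] - center_xy[0]
    dy = marker_xy[1] - center_xy[1]
    return math.atan2(dy, dx)            # radians, in image coordinates

def angular_displacement(prev_marker, curr_marker, center):
    delta = wheel_angle(curr_marker, center) - wheel_angle(prev_marker, center)
    # Wrap to (-pi, pi] so a small rotation across the +/-pi boundary stays small.
    return (delta + math.pi) % (2 * math.pi) - math.pi

center = (320.0, 240.0)                  # wheel center in pixels (from calibration)
delta = angular_displacement((400.0, 240.0), (398.5, 260.0), center)
scroll_steps = int(round(math.degrees(delta) / 15.0))   # e.g., one step per 15 degrees
print(f"rotated {math.degrees(delta):.1f} deg -> scroll {scroll_steps} step(s)")
```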
In illustrative cases, an optical link is established through light interactions between the controller and the MCD. Light originating from the MCD illuminates the mechanical user interfaces, which subsequently reflect a portion of the original light back to the MCD. The reflections are recorded by one or more light sensors on the MCD, such as a CCD camera. A computer onboard the MCD analyzes the recorded light signals (e.g., to determine the shape or intensity of the recorded light), and based on data regarding these light signals, generates control signals that are subsequently executed by the MCD.
In many implementations, the MCD is attached to the controller such that a display or other light source on the MCD faces towards user interfaces of the controller. In some cases, a secondary display or light source is present on the MCD, and, when the MCD and controller are attached to each other, one display or light source on the MCD faces outwards to serve as a graphical or visual user interface for interacting with human users, and the second display or light source on the MCD faces the user interfaces and serves as a controllable light source to illuminate the controller.
In the example shown in
In illustrative implementations, built-in light sensors in the MCD capture positional information regarding the mechanical user interfaces. For example, in some cases, a built-in camera is used to record light that reflects from the user interfaces. The camera includes one or more lenses and is located on the front or back of the MCD. In some cases, other sensors (such as accelerometers, illumination sensors, proximity sensors, or single-pixel detectors) are utilized in the system. In some cases, the camera or light sensors include electronic circuits, optical components, and embedded software to aid image capturing and processing functions.
In the example shown in
In some embodiments, the handheld controller has slots or openings such that ambient light enters and acts as an alternative or enhancing light source. Similarly, in some cases, the controller has larger openings, such that some or all of the user's fingers fit through them. In that case, user interfaces are located on the inside of the hollow controller for the user to access and control. In some cases: (a) the handheld controller is structurally minimal with sufficient structural support to hold the mechanical user interfaces and the MCD in place; and (b) ambient light is present inside the controller, even when the controller is attached to the MCD through a rigid physical connection.
In some cases, the handheld controller is manufactured from one or more lightweight and/or biodegradable plastics. In many embodiments, the controller contains no electronic or metal components and is constructed entirely from plastic through molding techniques or 3D-printing systems.
In the example shown in
In many cases, the fastest repetition rate of the method is defined by the component with the slowest operational rate. In some cases, for example, the slowest operational rate is: (a) the frames-per-second output of a graphical display; (b) a delayed mechanical response of components in the control canvas to physical motion applied by the user; or (c) a frames-per-second rate at which control signals are detected by the camera.
In some cases, a system (which comprises the MCD and controller) operates with a given set of initial parameters that are defined prior to the system application. In some cases, the system also changes parameters of certain components and their subcomponents during runtime dynamically or through feedback from another component's reading. For example, in some cases, parameters of the light source include the intensity of the emitted light, the rate of light output (e.g. rapid or varying on/off light triggers), spatial coding, or a combination of all within a given time interval. The color output is defined a priori or varied during runtime. In some cases, parameters of the light detector include the rate of capture (frames per second), the sensitivity of the detector during each light capture interval, or the sensitivity to a given color (i.e. wavelength).
Visual features 420 are affixed to, or are part of, components in the control canvas, including: (a) one or more components that are moveable relative to the housing of the controller, and (b) one or more components that have a fixed position relative to the housing of the controller. A visual feature 420 that is placed on a moveable component moves when that component moves, and thus facilitates motion tracking of that component.
The camera of the MCD images the control canvas of the handheld controller. Preferably, the MCD is attached to the controller so that the field-of-view of the camera coincides with the control canvas area. In some cases, the control canvas is partially occluded from the camera field-of-view. Mechanical manipulation (by a human user) of a user interface onboard the handheld controller causes the orientation or position of one or more of the visual features to change over a given time period. This causes a spatial and temporal change in the control canvas's layout, effectively changing the visual content, which the camera of the MCD records. A computer determines the visual content of the control canvas by the instantaneous location of the user interfaces, canvas elements, and their corresponding visual features. In some cases, the computer detects “background” areas in the control canvas that are void of visual features used for control.
In the example shown in
In illustrative implementations, visual features 420 are either control features or calibration features. Control features are located on components that are moved by the mechanical user interfaces, which are in turn mechanically moved by human input (e.g., pressing a button, sliding a linear slider or turning a dial). A computer analyzes a camera image stream in order to track motion of the control features, maps the motion to control signals, and outputs control signals to modify a graphic display of the MCD in real time. Thus, mechanical movement of a user interface of the handheld controller causes real-time changes in the graphic display onboard the MCD. A computer onboard the MCD performs a frame-by-frame analysis of feature movements in the camera images. The computer calculates relative control feature displacements, color variances, or intensity changes that occur over time in the frames captured by the camera. The computer recognizes features and detects changes in spatial pixel intensity from the camera's available monochrome or color channels. The computer tracks spatial displacement or variation of each feature, including linear or rotational displacement, or changes in the spatial size (area) or relative separation of the feature.
In illustrative implementations, calibration features are used for calibration procedures, positional reference, signal quality checks, or device recognition. The calibration features are located on components that have either a fixed or a moving position, relative to the controller housing. The positions of calibration features within the control canvas are related to the positions of control features and serve as “anchor” points for computing the relative differences between control and calibration features.
Calibration features are desirable, in order to calibrate for physical variations in the controller or MCD, including variations that occur during initial fabrication or during use. For example, in some cases, calibration features are used to accommodate: (a) variations in MCD placement relative to the controller after attaching the controller to the MCD; (b) variations that occur during operation due to mechanical shock or hardware deformation from the user input; or (c) differences in camera optics between MCD models, and resulting differences in image frame content and orientation. In illustrative implementations, calibration features provide visual cues regarding the relative position, orientation, path, or area in which to track control features.
In some cases, a computer analyzes the camera frames and determines the position of control features and calibration features relative to each other or relative to the controller itself. For instance, in some cases: (a) a rotary dial is a component of a linear slider, such that the linear position of the rotary dial varies according to the position of the linear slider; and (b) displacement of the linear slider is detected prior to analyzing rotation of the rotary dial.
In some cases, the material and color of the visual features (e.g., control features or calibration features) facilitate optical tracking. In some cases, the signal strength of visual features in images recorded by the MCD camera is a function of the feature's material composition. Preferably, the contrast between a visual feature and the surrounding background is maximized. For example, in some cases, contrast is enhanced by using materials such as retroreflectors, mirrors, or metallic surfaces.
In some cases, material properties of a visual feature are selected such that the visual feature reflects light in a desirable way. This is illustrated in
In
In
In
In the example shown in
In some implementations, multi-colored feature patterns simplify the distinction between different movable mechanical interfaces and the distinction between control and calibration features. In some implementations, scattering-specific and absorption-specific pigments are used to differentiate visual feature types or their assigned roles. In some cases, optical polarization, combined with pigments and other optical properties, is also used to distinguish between visual features.
A variety of optical patterns may be used for the visual features (e.g., control features or calibration features). In some cases, a circular dot is used (e.g., for displacement tracking of a mechanical component). In some cases, an elongated or line-like feature is used (e.g., for tracking rotation). In some cases, a calibration feature covers the control feature's travel range. For example, in some cases, such a calibration feature (which designates a specific area for feature detection), comprises a rectangle, a ring, or a ribbon that outlines a travel range of a control feature. In some cases, a checkerboard is used for calibration. In some cases, a barcode is used.
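By way of illustration only, where a checkerboard is used for calibration, its interior corners might be located with standard image-processing routines such as in the following OpenCV sketch; the pattern size (7×5 interior corners) and the image file name are illustrative assumptions.

```python
# Sketch: locate the interior corners of a checkerboard calibration feature
# in an image of the control canvas, then refine them to sub-pixel accuracy.
# The pattern size (7 x 5 interior corners) and the image file are assumptions.
import cv2

image = cv2.imread("control_canvas.png")          # a frame of the control canvas
if image is None:
    raise SystemExit("control_canvas.png not found (illustrative input)")
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

found, corners = cv2.findChessboardCorners(gray, (7, 5))
if found:
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 0.001)
    corners = cv2.cornerSubPix(gray, corners, (11, 11), (-1, -1), criteria)
    print(f"checkerboard found; first corner at {corners[0].ravel()}")
else:
    print("checkerboard not found in this frame")
```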
In illustrative implementations, a computer processes images of mechanical inputs of the hardware (including analyzing changes in the control canvas, mapping these changes to control signals, and outputting the control signals) with extremely short processing times. This very rapid processing is facilitated by using these optical patterns.
In
In
In
In
In illustrative implementations, the MCD includes one or more point light sources and one or more spatial light sources, each of which is controlled by a computer onboard the MCD. For example, in some cases, a point light source onboard the MCD comprises a high-intensity LED unit used for flash photography. In some cases, a spatial light source onboard the MCD comprises a raster graphics display, liquid-crystal display (LCD), light-emitting diode (LED) display, organic-light-emitting diode (OLED) display, or electronic ink (E Ink) display. This invention is not limited to any particular type of light source. Any point or spatial light source that illuminates the control canvas may be used.
In exemplary embodiments of this invention, the MCD screen emits light to illuminate visual features of the handheld controller. The light is spatially uniform or has a spatial intensity gradient. In some cases (e.g.,
In illustrative implementations, the intensity of the light from the MCD is constant, time varying, or a mixture of both. In some cases, the update frequency of the optical link (between the MCD camera and visual features of the controller) is limited by the light detector's reading rate (i.e. frame rate). In some cases, for time varying implementations, the light from the MCD is periodically on/off-pulsed or alternated between selected intensity ranges. In some implementations, the light source and light detector are time-synchronized. Given the short distances involved and the speed of light, the time that it takes for light to travel from the MCD to the visual features and back is so short that it is treated as instantaneous, for computational purposes. With this instantaneous travel time, the beginning of every source pulse period marks the time when the light detector is triggered for signal acquisition.
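By way of illustration only, the following sketch shows one plausible pulse-synchronized acquisition loop; set_light_source() is a hypothetical placeholder for whatever interface the MCD exposes for its screen or LED (not a real API), and the pulse rate is an illustrative assumption.

```python
# Sketch of a pulse-synchronized acquisition loop: the light source is switched
# on at the start of each pulse period, a frame is grabbed immediately (light
# travel time is treated as instantaneous), then the source is switched off for
# the remainder of the period.
import time
import cv2

PULSE_PERIOD_S = 0.1              # 10 Hz on/off pulsing (illustrative)

def set_light_source(on):         # hypothetical hook into the MCD's screen/LED
    pass

cap = cv2.VideoCapture(0)
for _ in range(50):               # 50 pulse periods (illustrative)
    t0 = time.monotonic()
    set_light_source(True)        # the start of the source pulse ...
    ok, frame = cap.read()        # ... triggers signal acquisition immediately
    set_light_source(False)
    # ... analyze `frame` here ...
    elapsed = time.monotonic() - t0
    time.sleep(max(0.0, PULSE_PERIOD_S - elapsed))
cap.release()
```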
In some cases, timing implementations are enhanced by using a light source that changes positions over time, as depicted in
In illustrative implementations, system calibration is used to provide a stable and high quality control link between the controller and the MCD. Knowing the relative spatial positions of the visual features in an image frame and the allowed range of their movement paths is desirable for rapid processing. Calibration is desirable because the optical properties of an MCD camera vary between different MCD models, series, and makes. Among other things, differences in optical lenses, CCD chip size, chip light sensitivity, optical axis relative to the device, and camera location may cause large variations between the spatial location and size of features within images taken from the different MCDs.
In illustrative implementations of this invention, calibration is performed initially and during operation of the system. In many cases, calibration is performed on the program's first cycle to collect initial parameters of the system. However, in some use scenarios, aspects of the system change during runtime, such as positional variance of the MCD with respect to the control canvas. In these scenarios, it is useful to trigger a calibration step on the next program cycle. In some cases, certain calibration steps are performed on every program cycle, while others are performed sparsely or only once.
In illustrative implementations, a camera is used as a light detector that provides the calibration or feature detector a new raster image on every new program cycle. The image contains visual data regarding the control canvas, from which the positional information of the various control and calibration features are extracted by a computer. For example, in some cases, a computer uses well established image processing algorithms to find calibration and control features and to record their positions with respect to the raster image coordinates. For example, in some cases: (a) a certain visual feature in the control canvas is known to be a round dot; (b) a computer onboard the MCD analyzes the image with a blob-detector algorithm to find the general location of the dot; and (c) the computer calculates the centroid of the pixels corresponding to the dot to achieve subpixel positional accuracy, thereby more accurately determining the center location of the dot with respect to the image coordinates.
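By way of illustration only, the dot-finding step just described might be sketched as follows in Python with OpenCV: the frame is thresholded, the largest bright blob is taken as the dot, and its centroid is computed from image moments, which yields sub-pixel coordinates; the threshold and the synthetic test image are illustrative assumptions.

```python
# Sketch: find a round dot in a grayscale frame and compute its centroid to
# sub-pixel accuracy using image moments. The threshold value is an assumption.
import cv2
import numpy as np

def find_dot_centroid(gray, threshold=180):
    _, binary = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)   # OpenCV 4.x signature
    if not contours:
        return None
    dot = max(contours, key=cv2.contourArea)   # assume the dot is the largest blob
    m = cv2.moments(dot)
    if m["m00"] == 0:
        return None
    # Centroid from moments: floating-point values, hence sub-pixel accuracy.
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

frame = np.zeros((480, 640), dtype=np.uint8)
cv2.circle(frame, (321, 203), 10, 255, -1)     # synthetic dot for illustration
print(find_dot_centroid(frame))                # approximately (321.0, 203.0)
```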
The function p=f(d) 872 or lookup table is predefined or is determined through one or more system calibration methods. In many implementations, a predetermined function p=f(d) is defined and then combined with a scaling factor c (e.g., pixels/millimeters) that is determined through system calibration. For example, as a user presses on an I/O device and causes, by mechanical pressure, control features to move along a physical path in the control canvas, a camera detects the movement, and a computer onboard the MCD outputs control signals to cause a graphic image on the MCD display to move by a displacement that is scaled in distance by c. In many implementations: (a) the function p does not represent a one-to-one positional mapping from control canvas to MCD display; and (b) the function p instead skews the control feature path, rotates the path about a point, inverts movement directions, or causes the graphics to move along an entirely different path characteristic than that of the control feature.
In some cases, the mapping function p=f(d) 872 is finite or periodic. A mechanical slider that moves along a fixed path causes the corresponding control feature to change its position by path change d 854 as illustrated in
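By way of illustration only, the mapping p=f(d) and scaling factor c discussed above might be sketched as follows; the numeric values, and the choice of a simple linear mapping with a periodic variant, are illustrative assumptions.

```python
# Sketch of mapping functions p = f(d) from control-feature displacement d
# (in pixels, as seen by the camera) to graphic displacement p (in display
# pixels). The scale factor c would normally come from system calibration.

C = 2.5                     # calibration scale factor (display px per canvas px)

def linear_map(d):
    """Finite mapping: graphic displacement proportional to feature displacement."""
    return C * d

def periodic_map(d, period=360.0):
    """Periodic mapping: e.g., a dial whose control feature travels a closed path.
    The graphic position repeats every `period` units of feature displacement."""
    return C * (d % period)

print(linear_map(40.0))             # 100.0 display pixels
print(periodic_map(400.0))          # wraps: 400 mod 360 = 40 -> 100.0
```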
In some cases, the mapping function p=f(d) is applied to one or more graphic features. That is, in some cases, a single control feature controls the positions of multiple graphic features. Alternatively, in some cases, multiple control features drive their own mapping functions in an ensemble that controls the position of one or more graphics simultaneously. In some cases, a computer dynamically alters function p during system operation via system calibration.
In some cases, a computer recognizes errors in detection of control features (such as detecting control features that do not match a defined path or failing to recognize visual features), and then takes precautionary steps, such as determining whether (a) hardware (e.g., a visual feature) in the controller is broken, (b) the MCD is damaged, or (c) the connection between the MCD and the controller is damaged. In some cases, a computer also outputs control signals to cause an I/O device to notify a human user to take actions to correct the problem.
In many implementations, the exact path of control features is not known prior to system operation. Image calibration is performed to determine the locations and paths of control features. In some cases, image calibration removes distortions associated with the optical quality of the camera assembly and perspective deformations, and allows the interchangeability of MCD makes, series, and models within a single hardware attachment unit, or vice versa.
In some cases, a control feature moves along a finite path with given start and end points, as shown in
In some cases, calibration features are positioned at a known offset from a control feature path, as shown in
In some cases, one or more control features are placed on top of calibration features. In some cases, one or more calibration features trace the entire path of a control feature (as shown in
In some cases, an elongated calibration feature is positioned such that the calibration feature is offset from and parallel to an elongated path of a control feature. In some cases, the topological range (i.e. feature elevation relative to the camera's perpendicular plane) is determined by positioning calibration features at the apex and base of a control feature's elevation range (elevation with respect to the control canvas plane). In some cases, features that have no distinct path, but rather, are predictably located within a given area or “zone” are surrounded by a box-like calibration feature that indicates the allowed feature location area. In some cases, a calibration feature is used that indicates an area that should be free of control features.
In some cases: (a) the control feature travels in periodic movements; (b) the control feature path circumscribes a region; and (c) one or more calibration features demark a center point 912 at the center of the region. In some implementations, the center point is indicated directly by placing a control feature at the center location of the rotating mechanical interface, as is illustrated in
In some implementations, a set of calibration features are placed around a component in the controller, in order to indicate the position of one or more points in the component, as shown in
In some implementations: (a) a calibration feature spans the entire control feature path, as shown in
The examples in
In some implementations, the control features themselves are used to calculate the control feature path and position. This is advantageous, for example, where: (a) no calibration features are available, or (b) a given mechanical interface does not support calibration features. In some cases (in which the control features are used to calculate the control feature path), the control features are moved into all their possible states while tracing the positions and storing intermediate positions into memory. This calibration method is done in advance by saving the path of each control feature, or during the system operation by using a “learning” algorithm while the user operates the system.
Alternatively, calibration is performed without using calibration features by taking a series of images in succession while the user operates the hardware attachment such that all possible positions, paths, and areas of the given set of control features are reached. A computer combines pixel values of each image frame using a non-maximum-suppression technique and outputs a composite image of the feature position space. This composite image maps out areas in which features are expected to be present and areas in which features are expected to be absent during normal operation.
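By way of illustration only, one plausible reading of this composite-image step is a per-pixel maximum taken over the calibration frames, which marks every location a control feature visited; the frame count and threshold in the following sketch are illustrative assumptions.

```python
# Sketch: build a composite "feature position space" image from a sequence of
# calibration frames by keeping, at every pixel, the maximum value observed.
# Bright pixels in the composite are locations a control feature visited.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)
composite = None
for _ in range(200):                       # user sweeps the interfaces meanwhile
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    composite = gray if composite is None else np.maximum(composite, gray)
cap.release()

if composite is not None:
    # Areas above the threshold are where features are expected during operation;
    # areas below it are expected to remain feature-free.
    _, expected_area = cv2.threshold(composite, 180, 255, cv2.THRESH_BINARY)
    cv2.imwrite("expected_feature_area.png", expected_area)
```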
In a separate implementation, a computer uses calibration features to determine positions of the control canvas, light source, light detector, and display unit relative to each other (e.g. perpendicular distance between control canvas plane and display unit plane in millimeters). In some cases, a computer calculates these positions by detecting calibration features with known absolute displacements, and then combining this information with known MCD parameters such as the distance between the light detector unit and the display unit.
In some implementations, color segmentation is used to aid system calibration and to improve the signal-to-noise ratio (SNR) of the content within image frames. Color segmentation is implemented by either the light source, the light detector, or both. For example, in some cases, the color range of the light source is selected, such that the SNR of a given control feature is enhanced to spatially filter out the calibration features and background information from the detector signal. In some cases, the color range of light detector data is controlled through color channel filtering.
In some implementations, different areas of the control canvas have different spectral responses to light. For example, for a first color of light, a first region of the control canvas may reflect more light than a second region of the control canvas does, and for a second color, the first region of the control canvas may reflect less light than the second region does.
Similarly, in some cases, the data collected by the light detector is color segmented to achieve visual feature separation. For example, in some cases, a CCD camera in the MCD operates using three distinct color channels (i.e. RGB: red, green, blue), and features that appear in one color channel are segmented from features appearing in one or both of the other channels. In some cases, RGB color channel data is reformulated into other color spaces, such as YUV, CMYK, or L*a*b*, thereby providing additional options in channel segmentation. For example, in some cases, using the red chroma channel (Cr in YUV color space) significantly enhances features with a red tone, while strongly suppressing features with a blue tone. Color segmentation methods are advantageous in low-light environments with limited light detector sensitivity.
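By way of illustration only, the red-chroma segmentation just described might be sketched as follows using OpenCV's YCrCb conversion (Cr being the red chroma channel); the threshold and the synthetic test image are illustrative assumptions.

```python
# Sketch: segment red-toned visual features by converting a BGR frame to the
# YCrCb color space and thresholding the Cr (red-chroma) channel. Red-toned
# features have high Cr values; blue-toned features have low Cr values.
import cv2
import numpy as np

def segment_red_features(frame_bgr, cr_threshold=160):
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    _, cr, _ = cv2.split(ycrcb)
    _, mask = cv2.threshold(cr, cr_threshold, 255, cv2.THRESH_BINARY)
    return mask          # white where red-toned features are, black elsewhere

frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[40:60, 40:60] = (0, 0, 255)            # a red square (BGR), for illustration
mask = segment_red_features(frame)
print(int(mask[50, 50]), int(mask[10, 10]))  # 255 inside the feature, 0 outside
```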
In some implementations, other noise reduction techniques are used to enhance the SNR of control features during system operation.
For example, in some cases: A series of “ground truth” images are captured during system calibration. The ground truth images can be subtracted from frames captured during system operation, which results in composite images that are void of background image content. In some cases, the active light source is turned off when acquiring the ground truth images. This causes the information in the captured frames to be effectively a snapshot of undesirable image noise content under ambient light. A computer treats the noise image as a ground truth and subtracts the noise image from subsequent image frames during system operation. In some cases, noise reduction techniques are used prior to the main system operation time, as a calibration step, or triggered any time during the system operation. For example, in some use scenarios, an optical link between controller and MCD is determined to be unsatisfactory, and a new ground truth snapshot sequence is triggered by briefly turning the light source off, capturing an image frame, and then turning the light source back on.
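By way of illustration only, the ground-truth subtraction described above might be sketched as follows; set_light_source() is a hypothetical placeholder for the MCD's light-source control (not a real API), and the capture details are illustrative assumptions.

```python
# Sketch: capture a "ground truth" noise frame with the active light source off,
# then subtract it from frames captured during operation so that mostly the
# actively illuminated content (the visual features) remains.
import cv2

def set_light_source(on):                     # hypothetical hook into the MCD's screen/LED
    pass

def capture_gray(cap):
    ok, frame = cap.read()
    return cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY) if ok else None

cap = cv2.VideoCapture(0)

set_light_source(False)                       # ambient light only
ground_truth = capture_gray(cap)              # snapshot of undesirable noise content
set_light_source(True)

frame = capture_gray(cap)                     # a frame during normal operation
if ground_truth is not None and frame is not None:
    # Saturating subtraction removes the ambient/background content.
    cleaned = cv2.subtract(frame, ground_truth)
    cv2.imwrite("cleaned_frame.png", cleaned)
cap.release()
```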
In some cases: (a) reflection/absorption spectra of visual features are not known in advance; and (b) a color sweep is performed during system calibration by varying the color of the light source. A first color is emitted from the light source, and the response from each visual feature is measured by the light detector. This is repeated for a variety of different colors. A computer compares the color response measurements from each visual feature, and selects the color/detector-sensitivity combinations that favor optimal feature segmentation. In some cases, a computer dynamically adjusts the color range of the light source during system operation, in order to optimize the control feature response in each image frame versus the light detector's sensitivity.
Alternatively, in some cases: (a) reflection/absorption spectra of visual features are not known in advance; and (b) a color sweep is performed during system calibration by using a constant light source color, but computationally altering the color response of the light detector by sweeping through color channels, color spaces, and hue levels in each image frame.
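By way of illustration only, such a computational color sweep might be sketched as follows: each candidate channel is scored by the contrast it yields between a known feature region and a known background region, and the highest-scoring channel is selected; the region coordinates and synthetic test frame are illustrative assumptions.

```python
# Sketch: a computational "color sweep" over detector channels. Each candidate
# channel is scored by the contrast between a known feature region and a known
# background region (regions assumed known from calibration features), and the
# channel with the highest contrast is selected for feature segmentation.
import cv2
import numpy as np

def candidate_channels(frame_bgr):
    b, g, r = cv2.split(frame_bgr)
    y, cr, cb = cv2.split(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb))
    h, s, v = cv2.split(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV))
    return {"B": b, "G": g, "R": r, "Cr": cr, "Cb": cb, "S": s, "V": v}

def best_channel(frame_bgr, feature_roi, background_roi):
    (fx, fy, fw, fh), (bx, by, bw, bh) = feature_roi, background_roi
    scores = {}
    for name, channel in candidate_channels(frame_bgr).items():
        feature_mean = float(np.mean(channel[fy:fy + fh, fx:fx + fw]))
        background_mean = float(np.mean(channel[by:by + bh, bx:bx + bw]))
        scores[name] = abs(feature_mean - background_mean)     # contrast score
    return max(scores, key=scores.get), scores

# Illustrative regions (x, y, width, height) taken from calibration features.
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[20:40, 20:40] = (0, 0, 255)                              # red feature patch
name, scores = best_channel(frame, (20, 20, 20, 20), (60, 60, 20, 20))
print(f"best channel: {name}")
```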
The first example is onboard an MCD. In
The second example is in memory for a server computer. In
The third example is in a master copy. In
In
In
In
In
In
In
In
In
In
In
In
In
In
In
In
In
In
In
In
In
In some implementations of this invention, relay optics increase, decrease or shift a camera's field of view, and thereby (a) increase spatial resolution and (b) center the control components in a captured image. The increased spatial resolution facilitates optical tracking of visual features (e.g., 420) of moving control components (e.g., 406, 415, 419, 423) and increases the range (depth) of such optical tracking.
In the example shown in
In the example shown in
The controller device 1760 includes I/O devices 1761, such as a dial 215, button 218, or slider 216. A human user presses against, or otherwise applies force to, the I/O devices 1761, in order to mechanically move the I/O devices 1761. The movement of the I/O devices 1761 is, in turn, mechanically transferred to control components 310, causing the control components 310 to move also.
The movement of the control components 310 is used to control operation of the MCD 1720 or apparatus onboard the controller 1760, as follows: One or more light sources onboard the MCD 1720 (e.g. a display screen 1721, LED 1723 or flash 1725) illuminate the moving control components 310. A camera 1727 onboard the MCD 1720 captures images of the moving control components 310 and of visual features attached to the moving control components 310. One or more computers 1729 onboard the MCD 1720 process the images and output control signals. In some cases, the control signals control operation of the MCD 1720, such as by controlling a visual display on a screen 1721 of the MCD 1720. In some cases, the control signals are sent to the controller device 1760 via a wired communication link 1772 or via a wireless communication link 1774. The wireless communication link 1774 is between wireless communication module 1726 (which is onboard the MCD 1720) and wireless communication module 1776 (which is onboard the controller device 1760). The control signals control operation of one or more devices onboard the controller device 1760, such as a variable lens system 1762, apparatus for objective refractive measurements 1764, relaxation apparatus 1765, imaging apparatus 1766, concentric rings 1767 or a tonometer 1768.
Alternatively or in addition, in some cases, at least some of the I/O devices 1761 are operatively connected to a transducer module 1730 onboard the controller device 1760. The transducer module 1730 converts mechanical motion into electrical energy. For example, in some cases, the mechanical motion is imparted by a human user manipulating at least some of the I/O devices 1761. The transducer module 1730 includes: (a) a transducer 1731 for transforming mechanical movement into analog electrical current or voltage; and (b) an ADC (analog to digital converter) 1732 for converting the analog electrical current or voltage into digital data. The digital data in turn controls the operation of devices onboard the controller device 1760, such as a variable lens system 1762, apparatus for objective refractive measurements 1764, relaxation apparatus 1765, imaging apparatus 1766, concentric rings 1767 or a tonometer 1768.
The controller device 1760 includes a variable lens system (VLS) 1762. One or more refractive attributes (e.g., spherical power, cylindrical power, cylindrical axis, prism or base) of the VLS 1762 are adjustable. The user holds the device 1760 up to his or her eyes, and looks through the device 1760 (including through the VLS 1762) at screen 1721 of MCD 1720. Iterative vision tests are performed, in which refractive properties of the VLS are changed from iteration to iteration. I/O devices 1761 onboard the controller device 1760 receive input from the user regarding which VLS setting results in clearer vision. For example, in some use scenarios, if spherical power is being optimized during a particular step of the testing procedure, the user inputs feedback regarding whether a test image appears clearer with the current VLS setting (a changed spherical power) than with the last VLS setting (a prior spherical power).
In the example shown in
During the iterative vision testing, a mobile computing device (MCD) 1720 is attached to the front of the controller device 1760 (i.e., to a side of the device 1760 opposite the user's eyes). During the test, the scene a user sees (when looking through the controller device 1760) is an image displayed on a screen 1721 of the MCD 1720. For example, in some cases, the MCD 1720 comprises a smartphone or cell phone, and the user views all or portions of the phone's display screen when looking through the controller device 1760.
After the MCD 1720 is attached to the front of the controller device 1760, the user looks through the controller device 1760. Specifically, the user holds a viewport or eyeports of the controller device 1760 at eye level, and looks through the controller device 1760 to see the MCD screen 1721. The user sees light that travels through the controller device 1760: light travels from the MCD screen, then through the variable lens system (1762) of the controller device 1760, then through a viewport or eyeholes of the controller device 1760, and then to the eyes. The MCD 1720 is attached on one side of the device 1760; the viewport or eyeholes are on an opposite side of the device 1760.
During at least part of the iterative vision test, the MCD screen displays one or more visual patterns that are used in the test.
In illustrative implementations, the user gives feedback regarding which setting of the variable lens system (VLS) 1762 produces the clearest vision for the user. For example, in some use scenarios: (a) in a first trial, a VLS refractive attribute (e.g., spherical power, cylindrical power, cylindrical axis, prism or base) is set to a first value while the user looks through the controller device 1760 at a test image displayed on the MCD screen; (b) in a second trial, the VLS refractive attribute is set to a second value while the user looks through the controller device 1760 at the same test image on the MCD screen; and (c) an I/O device 1761 accepts input from the user regarding whether the image in the second trial looks clearer or less clear than in the first trial. The format of the input may vary. For example, in some cases, the user simply indicates which image he or she prefers, and this input regarding preference is a proxy for which image appears clearer to the user.
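By way of illustration only, the iterative comparison described above might be organized as a simple interval-narrowing search over spherical power, as sketched below; set_spherical_power() and ask_user_which_is_clearer() are hypothetical placeholders for the VLS actuation and the controller's I/O devices, and the diopter range, resolution, and search strategy are illustrative assumptions rather than the testing procedure of any particular embodiment.

```python
# Sketch of an iterative subjective comparison over spherical power: two VLS
# settings are presented, the user indicates which looked clearer, and the
# search interval is narrowed accordingly (assumes clarity is unimodal in power).
# set_spherical_power() and ask_user_which_is_clearer() are hypothetical
# placeholders for the VLS and the controller's I/O devices.

def set_spherical_power(diopters):          # hypothetical VLS hook
    print(f"(VLS set to {diopters:+.2f} D)")

def ask_user_which_is_clearer():            # hypothetical I/O hook
    return input("Which was clearer, 1 or 2? ").strip()

low, high = -6.0, +6.0                      # assumed search range in diopters
while high - low > 0.25:                    # stop at quarter-diopter resolution
    first = low + (high - low) / 3.0
    second = high - (high - low) / 3.0
    set_spherical_power(first)              # trial 1: user views the test image
    set_spherical_power(second)             # trial 2: user views the same image
    if ask_user_which_is_clearer() == "1":
        high = second                       # keep the interval around trial 1
    else:
        low = first                         # keep the interval around trial 2
print(f"estimated spherical correction: about {(low + high) / 2.0:+.2f} D")
```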
The VLS 1762 comprises one or more lenses and, in some cases, one or more actuators. One or more refractive attributes (e.g., spherical power, cylindrical power, cylindrical axis, prism or base) of the VLS 1762 are programmable and controllable. The VLS 1762 may be implemented in many different ways. For example, in illustrative implementations, the VLS 1762 includes one or more of the following: an Alvarez lens pair, Jackson cross-cylinders, Humphrey lenses, a sphero-cylindrical lens pair, Risley prisms, or liquid lenses.
In the example shown in
In some cases, the additional apparatus 1763 includes apparatus for taking objective refractive measurements 1764 (i.e., measurements that do not involve feedback regarding the user's subjective visual perception). An iterative testing procedure that involves feedback regarding the user's subjective visual perceptions is performed. In some cases, the objective measurement apparatus 1764 takes measurements during each iteration of an iterative vision test. Alternatively or in addition, the variable lens system 1762 is used to improve measurements taken by the objective measurement apparatus, by optimizing focusing into the retina. In some implementations, the apparatus for objective refractive measurement 1764 comprises one or more of the following: (1) an auto-refractor, which automates a Scheiner's test with a lens and fundus camera to assess the image quality of a known source falling into the retina; (2) a Shack-Hartmann device for wavefront sensing, which analyzes the distortions of a known light pattern reflected onto a human retina and creates a wavefront map; or (3) a retroillumination system, which captures images of an eye structure while illuminating the eye structure from the rear (e.g., by reflected light).
In some cases, the additional apparatus 1763 includes relaxation apparatus 1765. The relaxation apparatus 1765 presents stimuli to either an eye being tested, the other eye, or both eyes. The stimuli tend to control the accommodation (and thus the optical power) of the user's eyes. In some cases, the relaxation apparatus includes a combination of one or more of the following: (a) a lens or group of lenses, (b) actuators for moving the lens or lenses, (c) masks or other spatial light attenuators, (d) mirrors, optical fibers or other relay optics for steering light, and (e) a display screen or film for displaying images.
In some cases, an iterative vision test is performed to measure refractive aberrations (e.g., myopia, hyperopia, prism, astigmatism, spherical aberration, coma or trefoil) of the eyes of a human user. The test is performed while the controller device 1760 is positioned in front of the user's eyes. The test involves the use of one or more of the VLS 1762, apparatus for objective refractive assessment 1764 and the relaxation apparatus 1765. In some cases, the iterative vision test involves displaying images on a screen 1721 of an MCD 1720 that is releasably attached to the controller device 1760. For example, in some cases the iterative eye test is performed in the manner described in the NETRA Patent, and the images that are displayed on an MCD screen include images that are described in the NETRA Patent. As used herein, the “NETRA Patent” means U.S. Pat. No. 87,817,871 B2, Near Eye Tool for Refractive Assessment, Vitor Pamplona et al. The NETRA Patent is incorporated herein by reference.
In some cases, a computer (e.g., onboard the MCD) analyzes data gathered during the iterative eye test and calculates refractive aberration data—that is, data indicative of one or more refractive aberrations of the eyes of a human user. The computer takes this refractive aberration data as input and outputs control signals to control one or more devices in order to compensate for the refractive aberrations. For example, in some cases, the control signals control the VLS 1762 such that the VLS 1762 corrects (compensates for) the refractive aberrations indicated by the refractive aberration data. Or, in some cases, the control signals cause visual images displayed by a screen of the MCD to be distorted in such a way as to compensate for the refractive aberrations indicated by the refractive aberration data. This distortion of images displayed by the MCD screen is sometimes called warping or pre-warping.
In some cases, the refractive aberrations are corrected (e.g., by controlling the VLS or distorting the MCD images) while a user watches visual content displayed by the MCD 1720, such as a photograph, interactive game, or virtual reality display. Thus, the user sees the visual content with corrected vision, without the need for eyeglasses or contacts.
In some cases, the refractive aberrations are corrected (e.g., by controlling the VLS or distorting the MCD images) while a user watches an augmented reality display. In the augmented reality display, images are displayed on an optical element (e.g., a half-silvered surface) that both reflects and transmits light, so that the user sees not only the augmented reality display that reflects from the optical element but also sees the light from an external scene that passes through the optical element.
This ability to detect and correct (compensate for) refractive aberrations of the human eye, without using conventional eyeglasses, is advantageous, including in virtual reality and augmented reality applications.
As noted above, the controller device 1760 is not always handheld. In some cases, the controller device is worn on the head or otherwise head-mounted. For applications in which the user is watching a long movie, or an interactive game, or a prolonged virtual reality display, it is sometimes advantageous for the controller device 1760 to be head-mounted or otherwise worn on the head, and for supplemental I/O devices that are not housed in the MCD 1720 or handheld device 1760 to be also used. For example, in some cases, the supplemental I/O devices include wireless communication modules for communicating with the MCD 1720.
In some cases, the additional apparatus 1763 onboard the controller device 1760 includes imaging apparatus 1766. The imaging apparatus 1766 includes one or more cameras and lenses. In some cases, the imaging apparatus 1766 images the retina or other parts or structures of a human eye. In some cases, the imaging apparatus 1766 is used to detect conditions of the human eye, including cataracts, retinal detachment or strabismus. In some cases, the imaging apparatus 1766 is used to measure inter-ocular distance or the orientation of the eye.
In some cases, the additional apparatus 1763 includes a set of concentric rings 1767 around each eyeport (e.g., 1108, 1109). In some cases, the concentric rings 1767 comprise active light sources, such as LEDs (light emitting diodes). In other cases, the concentric rings 1767 comprise reflective surfaces that are illuminated by light sources (such as an LED, display screen or flash) onboard the MCD 1720.
In some implementations, corneal topography is measured as follows: Concentric rings 1767 are actively illuminated (if they are active light sources, such as LEDs) or passively illuminated (if they are passive light sources, such as reflective surfaces). Light from the rings 1767 reflects off of the anterior surface of the cornea of an eye. The imaging apparatus 1766 onboard the controller device 1760 (or camera 1727 onboard the MCD 1720) captures images of the reflected light. A computer (e.g., onboard MCD 1720) analyzes these images in order to map the surface curvature of the cornea.
In some cases, the additional apparatus 1763 includes a tonometer 1768 that measures intraocular pressure of eyes of a human user. For example, in some cases, the tonometer 1768 comprises an applanation tonometer (which measures force needed to flatten an area of the cornea), such as a Goldmann tonometer or Perkins tonometer. In some cases, the tonometer 1768 comprises a dynamic contour tonometer. In some cases, the tonometer 1768 performs non-contact (e.g., air-puff) tonometry measurements.
In some cases, controller device 1760 includes relay optics 1769. The relay optics 1769 increase, decrease or shift a camera's field of view, and thereby (a) increase spatial resolution and (b) center the control components in a captured image. The increased spatial resolution facilitates optical tracking of visual features (e.g., 420) of moving control components (e.g., 406, 415, 419, 423) and increases the range (depth) of such optical tracking.
In illustrative implementations, one or more electronic computers (e.g. 622, 1312, 1729) are programmed and specially adapted: (1) to control the operation of, or interface with, hardware components of a mobile computing device (MCD), including one or more cameras, light sources (including flashes and LEDs), screens (including display screens or capacitive touch screens), graphical user interfaces, I/O devices and wireless communication modules; (2) to control the operation of, or interface with, hardware components of a controller device, including a variable lens system, apparatus for objective refractive measurements, imaging apparatus, light sources (e.g., an array of LEDs that form concentric rings), or tonometer; (3) to analyze frames captured by the camera to detect motion of visual features, to map the motion to control signals, and to generate the control signals to control one or more operations of the MCD, including altering a display of a graphical user interface; (4) to perform any other calculation, computation, program, algorithm, computer function or computer task described or implied above; (5) to receive signals indicative of human input; (6) to output signals for controlling transducers for outputting information in human perceivable format; and (7) to process data, to perform computations, to execute any algorithm or software, and to control the read or write of data to and from memory devices. In illustrative implementations, the one or more computers are onboard the MCD. Alternatively, at least one of the computers is remote from the MCD. The one or more computers are connected to each other or to other devices either: (a) wirelessly, (b) by wired connection, or (c) by a combination of wired and wireless links.
In illustrative implementations, one or more computers are programmed to perform any and all calculations, computations, programs, algorithms, computer functions and computer tasks described or implied above. For example, in some cases: (a) a machine-accessible medium has instructions encoded thereon that specify steps in a software program; and (b) the computer accesses the instructions encoded on the machine-accessible medium, in order to determine steps to execute in the program. In illustrative implementations, the machine-accessible medium comprises a tangible non-transitory medium. In some cases, the machine-accessible medium comprises (a) a memory unit or (b) an auxiliary memory storage device. For example, in some cases, a control unit in a computer fetches the instructions from memory.
In illustrative implementations, one or more computers execute programs according to instructions encoded in one or more tangible, non-transitory, machine-readable media. For example, in some cases, these instructions comprise instructions for a computer to perform any calculation, computation, program, algorithm, computer function or computer task described or implied above. For example, in some cases, instructions encoded in a tangible, non-transitory, computer-accessible medium comprise instructions for a computer: (1) to control the operation of, or interface with, hardware components of a mobile computing device (MCD), including one or more cameras, light sources (including flashes and LEDs), screens (including display screens or capacitive touch screens), graphical user interfaces, I/O devices and wireless communication modules; (2) to control the operation of, or interface with, hardware components of a controller device, including a variable lens system, apparatus for objective refractive measurements, imaging apparatus, light sources (e.g., an array of LEDs that form concentric rings), or tonometer; (3) to analyze frames captured by the camera to detect motion of visual features, to map the motion to control signals, and to generate the control signals to control one or more operations of the MCD, including altering a display of a graphical user interface; (4) to perform any other calculation, computation, program, algorithm, computer function or computer task described or implied above; (5) to receive signals indicative of human input; (6) to output signals for controlling transducers for outputting information in human perceivable format; and (7) to process data, to perform computations, to execute any algorithm or software, and to control the read or write of data to and from memory devices.
In illustrative implementations of this invention, a mobile computing device (MCD) includes a wireless communication module for wireless communication with other electronic devices in a network. The wireless communication module (e.g., module 626, 1726, 1776) includes (a) one or more antennas, (b) one or more wireless transceivers, transmitters or receivers, and (c) signal processing circuitry. The wireless communication module receives and transmits data in accordance with one or more wireless standards.
In illustrative implementations, one or more computers onboard the MCD are programmed for wireless communication over a network. For example, in some cases, one or more computers are programmed for network communication: (a) in accordance with the Internet Protocol Suite, or (b) in accordance with any industry standard for wireless communication, including IEEE 802.11 (Wi-Fi), IEEE 802.15 (Bluetooth/ZigBee), IEEE 802.16, IEEE 802.20 and including any mobile phone standard, including GSM (global system for mobile communications), UMTS (universal mobile telecommunication system), CDMA (code division multiple access, including IS-95, IS-2000, and WCDMA), or LTE (long term evolution).
The terms “a” and “an”, when modifying a noun, do not imply that only one of the noun exists.
To compute “based on” specified data means to perform a computation that takes the specified data as an input.
Here are some non-limiting examples of a “camera”: (a) a video camera; (b) a digital camera; (c) a sensor that records images; (d) a light sensor; (e) apparatus that includes a light sensor or an array of light sensors; and (f) apparatus for gathering data about light incident on the apparatus. The term “camera” includes any computers that process data captured by the camera.
The term “comprise” (and grammatical variations thereof) shall be construed as if followed by “without limitation”. If A comprises B, then A includes B and may include other things.
The term “computer” includes any computational device that performs logical and arithmetic operations. For example, in some cases, a “computer” comprises an electronic computational device, such as an integrated circuit, a microprocessor, a mobile computing device, a laptop computer, a tablet computer, a personal computer, or a mainframe computer. In some cases, a “computer” comprises: (a) a central processing unit, (b) an ALU (arithmetic/logic unit), (c) a memory unit, and (d) a control unit that controls actions of other components of the computer so that encoded steps of a program are executed in a sequence. In some cases, a “computer” also includes peripheral units including an auxiliary memory storage device (e.g., a disk drive or flash memory), or includes signal processing circuitry. However, a human is not a “computer”, as that term is used herein.
A “control canvas” means a set of visual features, in which the presence, position or motion of certain visual features is indicative of a user command or instruction, or is used to control the operation of another device. The term “control canvas” does not imply that a canvas textile is present.
“Controller” means a device that controls one or more hardware features or operations of another device.
“Defined Term” means a term or phrase that is set forth in quotation marks in this Definitions section.
For an event to occur “during” a time period, it is not necessary that the event occur throughout the entire time period. For example, an event that occurs during only a portion of a given time period occurs “during” the given time period.
The term “e.g.” means for example.
The fact that an “example” or multiple examples of something are given does not imply that they are the only instances of that thing. An example (or a group of examples) is merely a non-exhaustive and non-limiting illustration.
“Eyeport” means a hole or opening through which a human eye looks. In some but not all cases, an eyeport surrounds a lens or other optical element, such that light which passes through the eyeport travels through the lens or other optical element.
Unless the context clearly indicates otherwise: (1) a phrase that includes “a first” thing and “a second” thing does not imply an order of the two things (or that there are only two of the things); and (2) such a phrase is simply a way of identifying the two things, respectively, so that they each can be referred to later with specificity (e.g., by referring to “the first” thing and “the second” thing later). For example, unless the context clearly indicates otherwise, if an equation has a first term and a second term, then the equation may (or may not) have more than two terms, and the first term may occur before or after the second term in the equation. A phrase that includes a “third” thing, a “fourth” thing and so on shall be construed in like manner.
The term “for instance” means for example.
As used herein, the “forehead” means the region of a human face that covers the frontal bone, including the supraorbital ridges.
“Frontal bone” means the os frontale.
“Herein” means in this document, including text, specification, claims, abstract, and drawings.
As used herein: (1) “implementation” means an implementation of this invention; (2) “embodiment” means an embodiment of this invention; (3) “case” means an implementation of this invention; and (4) “use scenario” means a use scenario of this invention.
The term “include” (and grammatical variations thereof) shall be construed as if followed by “without limitation”.
“Intensity” means any measure of or related to intensity, energy or power. For example, the “intensity” of light includes any of the following measures: irradiance, spectral irradiance, radiant energy, radiant flux, spectral power, radiant intensity, spectral intensity, radiance, spectral radiance, radiant exitance, radiant emittance, spectral radiant exitance, spectral radiant emittance, radiosity, radiant exposure or radiant energy density.
“I/O device” means an input/output device. For example, an I/O device includes any device for (a) receiving input from a human, (b) providing output to a human, or (c) both. For example, an I/O device includes a graphical user interface, keyboard, mouse, touch screen, microphone, handheld controller, display screen, speaker, or projector for projecting a visual display. Also, for example, an I/O device includes any device (e.g., button, dial, knob, slider or haptic transducer) for receiving input from, or providing output to, a human.
“Light” means electromagnetic radiation of any frequency. For example, “light” includes, among other things, visible light and infrared light. Likewise, any term that directly or indirectly relates to light (e.g., “imaging”) shall be construed broadly as applying to electromagnetic radiation of any frequency.
“Metallics” means metallic surfaces or surfaces that are covered with metallic paint.
The term “mobile computing device” or “MCD” means a device that includes a computer, a camera, a display screen and a wireless transceiver. Non-limiting examples of an MCD include a smartphone, cell phone, mobile phone, phablet, tablet computer, laptop computer and notebook computer.
To “multiply” includes to multiply by an inverse. Thus, to “multiply” includes to divide.
The term “or” is inclusive, not exclusive. For example, A or B is true if A is true, or B is true, or both A and B are true. Also, for example, a calculation of A or B means a calculation of A, or a calculation of B, or a calculation of A and B.
A parenthesis is used simply to make text easier to read, by indicating a grouping of words. A parenthesis does not mean that the parenthetical material is optional or can be ignored.
The term “refractive aberration” means an optical aberration, of any order, of a refractive optical element such as a human eye. Non-limiting examples of “refractive aberration” of a human eye include myopia, hyperopia, prism (or tilt), astigmatism, secondary astigmatism, spherical aberration, coma, trefoil, and quadrafoil.
As used herein, a “set” must have at least two elements. The term “set” does not include a group with no elements and does not include a group with only one element. Mentioning a first set and a second set does not, in and of itself, create any implication regarding whether or not the first and second sets overlap (that is, intersect).
“Some” means one or more.
“Substantially” means at least ten percent. For example: (a) 112 is substantially larger than 100; and (b) 108 is not substantially larger than 100.
The term “such as” means for example.
To say that a medium has instructions encoded “thereon” means that the instructions are encoded on or in the medium.
“User interface” means an I/O device, as defined herein.
“Variable lens system” means a system of one or more lenses, the optical power of which system is adjustable.
Except to the extent that the context clearly requires otherwise, if steps in a method are described herein, then the method includes variations in which: (1) steps in the method occur in any order or sequence, including any order or sequence different than that described; (2) any step or steps in the method occurs more than once; (3) different steps, out of the steps in the method, occur a different number of times during the method; (4) any combination of steps in the method is done in parallel or serially; (5) any step or steps in the method is performed iteratively; (6) a given step in the method is applied to the same thing each time that the given step occurs or is applied to different things each time that the given step occurs; or (7) the method includes other steps, in addition to the steps described.
This Definitions section shall, in all cases, control over and override any other definition of the Defined Terms. For example, the definitions of Defined Terms set forth in this Definitions section override common usage or any external dictionary. If a given term is explicitly or implicitly defined in this document, then that definition shall be controlling, and shall override any definition of the given term arising from any source (e.g., a dictionary or common usage) that is external to this document. If this document provides clarification regarding the meaning of a particular term, then that clarification shall, to the extent applicable, override any definition of the given term arising from any source (e.g., a dictionary or common usage) that is external to this document. To the extent that any term or phrase is defined or clarified herein, such definition or clarification applies to any grammatical variation of such term or phrase, taking into account the difference in grammatical form. For example, the grammatical variations include noun, verb, participle, adjective, and possessive forms, and different declensions, and different tenses. In each case described in this paragraph, Applicant is acting as Applicant's own lexicographer.
This invention may be implemented in many different ways. Here are some non-limiting examples:
In one aspect, this invention is a method comprising, in combination: (a) a first component of an apparatus undergoing a first movement relative to housing of the apparatus, while a surface of the apparatus is pressed against the forehead and cheeks of a human user and the apparatus is attached to a mobile computing device; (b) a first camera onboard the mobile computing device capturing images indicative of the first movement; and (c) a computer onboard the mobile computing device processing the images to recognize the first movement and, based on data indicative of the first movement, generating control signals to control, at least in part, operation of the mobile computing device. In some cases, the control signals control at least part of a display on a screen of the mobile computing device. In some cases, the control signals cause a visual feature displayed on a screen of the mobile computing device to undergo a second movement, which second movement is calculated by the computer, such that the second movement is a function of the first movement. In some cases, a second component of the apparatus has one or more visual features that: (a) are in a fixed position relative to the housing; and (b) are indicative of a path of the first movement. In some cases, the visual features are offset at a specified distance from the path. In some cases, the visual features are positioned at the beginning and end of the path, or are offset at a specified distance from the beginning and end of the path. In some cases, the screen displays images used in an assessment of refractive aberrations of an eye of the human user. In some cases, the computer outputs signals to adjust a variable lens system onboard the apparatus, such that the variable lens system compensates for at least one refractive aberration of a user's eyes. In some cases, the variable lens system compensates for at least one refractive aberration of a user's eyes while (i) visual content is displayed on the screen and (ii) light from the screen reaches the eyes of the user. In some cases, the computer outputs signals that cause the screen to display visual content that is warped by a distortion, which distortion at least partially compensates for at least one refractive aberration of an eye of the user. In some cases, the computer generates, based at least in part on data indicative of the first movement, signals that control a tonometer onboard the apparatus, which tonometer measures intraocular pressure of an eye of the user. In some cases, the computer generates, based at least in part on data indicative of the first movement, signals that control a second camera onboard the apparatus, which second camera captures visual data regarding the retina or other structures or parts of an eye of the user. In some cases, the computer processes the visual data and detects a condition or parameter of an eye of the human, which condition or parameter is not a refractive aberration. In some cases, the computer generates, based at least in part on data indicative of the first movement, signals that control a corneal topography device onboard the apparatus, which corneal topography device measures surface curvature of a cornea of an eye of the user. Each of the cases described above in this paragraph is an example of the method described in the first sentence of this paragraph, and is also an example of an embodiment of this invention that is combinable with any other feature or embodiment of this invention.
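The following is a non-limiting, purely illustrative sketch (in Python) of one way in which a computed first movement (here assumed to be a dial rotation, in degrees, recovered from the captured images) could be mapped to a second movement of a visual feature displayed on the screen, such that the second movement is a function of the first movement. The linear mapping, the full-scale rotation, and the screen width are assumptions for illustration only.

    # Illustrative sketch only: map a detected "first movement" (dial rotation in
    # degrees) to a "second movement" (horizontal displacement, in pixels, of a
    # visual feature on the MCD's screen).
    SCREEN_WIDTH_PX = 1080        # assumed screen width in pixels
    DIAL_FULL_SCALE_DEG = 270.0   # assumed full travel of the dial

    def second_movement_px(first_movement_deg: float) -> int:
        """Return the on-screen displacement as a function of the dial rotation."""
        fraction = max(-1.0, min(1.0, first_movement_deg / DIAL_FULL_SCALE_DEG))
        return int(fraction * SCREEN_WIDTH_PX / 2)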
In another aspect, this invention is a system comprising, in combination: (a) apparatus which (i) includes an external curved surface that is configured to be pressed against the forehead and cheeks of a human user, (ii) includes an attachment mechanism for attaching the apparatus to a mobile computing device, and (iii) includes a first component that is configured to undergo movement relative to housing of the apparatus; and (b) a machine-readable medium having instructions encoded thereon for a computer: (i) to generate control signals that cause a first camera onboard the mobile computing device to capture images indicative of the movement, and (ii) to process the images to recognize the movement and, based on data indicative of the movement, to generate control signals to control, at least in part, operation of the mobile computing device. In some cases, the machine-readable medium is tangible and does not comprise a transitory signal. In some cases, the instructions encoded on the machine-readable medium include instructions for a computer to output control signals to cause a screen onboard the mobile computing device to display images used in an assessment of refractive aberrations of an eye of the human user. In some cases, the instructions encoded on the machine-readable medium include instructions for a computer to output control signals to control timing of the first camera and a light source onboard the mobile computing device, such that the emission of light by the light source and capture of images by the camera are synchronized. In some cases: (a) a second component of the apparatus has a fixed position relative to the housing; and (b) the second component has one or more visual features that are indicative of a path of the first movement. In some cases: (a) the images include data regarding a set of components of the apparatus, which set includes the first component; (b) at least some components in the set of components have a different color than the color of other components in the set; and (c) the instructions encoded on the machine-readable medium include instructions for a computer to output control signals to cause a light source onboard the mobile computing device to change, over time, color of light emitted by the light source. In some cases: (a) the images include data regarding a set of components of the apparatus, which set includes the first component; (b) at least some components in the set of components have a different color than the color of other components in the set; and (c) the instructions encoded on the machine-readable medium include instructions for a computer to change, over time, which colors are enhanced or suppressed during processing of images captured by the camera. In some cases, the instructions encoded on the machine-readable medium include instructions for the computer to output signals that cause a screen onboard the mobile computing device to display visual content that is warped by a distortion, which distortion at least partially compensates for at least one refractive aberration of an eye of the user. In some cases, the instructions encoded on the machine-readable medium include instructions for causing a tonometer onboard the apparatus to measure intraocular pressure of an eye of the user. In some cases, the instructions encoded on the machine-readable medium include instructions for causing a second camera onboard the apparatus to capture visual data regarding the retina or other structures or parts of an eye of the user.
In some cases, the instructions encoded on the machine-readable medium include instructions for the computer to process the visual data and detect a condition or parameter of an eye of the human, which condition or parameter is not a refractive aberration. In some cases, the instructions encoded on the machine-readable medium include instructions for causing a corneal topography device onboard the apparatus to measure surface curvature of a cornea of an eye of the user. In some cases, the instructions encoded on the machine-readable medium include instructions for the computer to output signals to adjust a variable lens system onboard the apparatus, such that the variable lens system at least partially compensates for at least one refractive aberration of an eye of the user. In some cases, the instructions encoded on the machine-readable medium include instructions for the computer to cause the variable lens system to at least partially compensate for at least one refractive aberration of an eye of the user while (i) visual content is displayed on the screen and (ii) light from the screen reaches the eyes of the user. Each of the cases described above in this paragraph is an example of the system described in the first sentence of this paragraph, and is also an example of an embodiment of this invention that is combinable with any other feature or embodiment of this invention.
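The following is a non-limiting, purely illustrative sketch (in Python, using the NumPy library) of the kind of color-selective processing mentioned above, in which the computer changes, over time, which colors are enhanced or suppressed during processing of images captured by the camera, so that differently colored components of the apparatus stand out in different frames. The per-frame channel cycling and the weighting scheme are assumptions for illustration only.

    # Illustrative sketch only: cycle, frame by frame, which color channel of a
    # captured image (in OpenCV's BGR channel order) is emphasized, while the
    # other channels are suppressed during processing.
    import numpy as np

    def emphasize_channel(frame_bgr: np.ndarray, frame_index: int) -> np.ndarray:
        """Return a single-channel image emphasizing one color per frame index."""
        channel = frame_index % 3                 # cycle through B, G, R over time
        weights = np.zeros(3, dtype=np.float32)
        weights[channel] = 1.0                    # enhance one channel, suppress the rest
        return (frame_bgr.astype(np.float32) @ weights).astype(np.uint8)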
In another aspect, this invention comprises apparatus that: (a) includes an attachment mechanism for attaching the apparatus to a mobile computing device; (b) includes a first component that is configured to undergo movement relative to housing of the apparatus; (c) includes an external curved surface that is configured to be pressed against the forehead and cheeks of a human user; and (d) has a hole which extends through the apparatus, such that, when the external curved surface is pressed against the forehead and cheeks and the apparatus is attached to the mobile computing device, a view through the apparatus exists, the view being through the hole to at least a portion of a screen of the mobile computing device. In some cases: (a) a second component of the apparatus is in a fixed position relative to the housing; and (b) the second component has one or more visual features that are indicative of a path of the movement. In some cases, the visual features are offset at a specified distance from the path. In some cases: (a) the first component has a first color and the second component has a second color; and (b) the first color is different than the second color. In some cases, the first component has a specular surface. In some cases, the first component has a surface such that, when incident light from a light source strikes the surface and reflects from the surface, the intensity of light reflected by the first component is greatest in a direction toward the light source. Each of the cases described above in this paragraph is an example of the apparatus described in the first sentence of this paragraph, and is also an example of an embodiment of this invention that is combinable with any other feature or embodiment of this invention.
The above description (including without limitation any attached drawings and figures) describes illustrative implementations of the invention. However, the invention may be implemented in other ways. The methods and apparatus which are described above are merely illustrative applications of the principles of the invention. Other arrangements, methods, modifications, and substitutions by one of ordinary skill in the art are therefore also within the scope of the present invention. Numerous modifications may be made by those skilled in the art without departing from the scope of the invention. Also, this invention includes without limitation each combination and permutation of one or more of the abovementioned implementations, embodiments and features.
This application is a non-provisional of, and claims the priority of the filing date of, U.S. Provisional Patent Application No. 61/970,032, filed Mar. 25, 2014 (the “032 Application”), and of U.S. Provisional Patent Application No. 62/103,062, filed Jan. 13, 2015 (the “062 Application”). The entire disclosures of the 032 Application and the 062 Application are incorporated herein by reference.
International Application: PCT/US15/22138, filed Mar. 24, 2015 (WO).