This application relates to ultrasonic transceivers and utilizing these devices to determine whether to adjust characteristics of a user interface or characteristics of items presented at a user interface.
Different types of user interfaces exist, and one such type is a keyboard or keypad. Smart phones and tablets often present the keyboard to a user as part of a touch screen. In these cases, alphanumeric keys are displayed on the touch screen and the user touches the screen at the key they wish to select. Unfortunately, the displays presented on touch screens are sometimes small. If the display is too small, the user has a hard time contacting or striking the correct key and in some cases strikes an incorrect key.
Still another type of interface also utilizes a touch screen but presents icons (instead of or in addition to alphanumeric keys) on the interface to a user. For example, this type of interface may be utilized as part of a cellular phone or smart phone. As with the displays involving keyboards, the icons may sometimes be too small for a correct selection to be made. If the display is small, the user has a hard time contacting or striking the intended icon and sometimes contacts an incorrect icon.
When the incorrect key or icon is contacted, the user may have to re-do their work. For example, if the user were typing an email, they may have to re-type portions of the message or even start over. If a user selects the incorrect icon, an unintended application may launch, wasting both user and system resources.
Previous approaches have not adequately addressed these problems. As a result, some user dissatisfaction with these previous approaches has occurred.
For a more complete understanding of the disclosure, reference should be made to the following detailed description and accompanying drawings wherein:
Skilled artisans will appreciate that elements in the figures are illustrated for simplicity and clarity. It will further be appreciated that certain actions and/or steps may be described or depicted in a particular order of occurrence while those skilled in the art will understand that such specificity with respect to sequence is not actually required. It will also be understood that the terms and expressions used herein have the ordinary meaning as is accorded to such terms and expressions with respect to their corresponding respective areas of inquiry and study except where specific meanings have otherwise been set forth herein.
The present approaches provide an adjustable interface whereby characteristics (e.g., the size) of displayable graphic units or graphic display units (e.g., alphanumeric keys or icons) are adjusted as a feature of interest (e.g., a finger) approaches a displayable graphic unit. These approaches utilize one or more ultrasonic transceivers that transmit an ultrasonic signal (and receive a reflected ultrasonic signal in return). The received ultrasonic signal(s) are used to identify a graphic display unit (e.g., key) and determine whether the feature of interest (e.g., the finger) is within a predetermined distance (e.g., height) of the graphic display unit so that characteristics of the graphical display unit can be altered.
Referring now to
The user interface 102 is any type of user display that presents information to a user. In one example, the user interface is a touch screen that, as described elsewhere herein, is divided into bins (i.e., a grid pattern). Graphical display units (e.g., alphanumeric keys or icons) are presented to the user on the user interface. Characteristics such as the length, width, color, intensity, and resolution of the graphical display units may be changed. The graphical display units are the collection of pixels that form an image. For example, the pixels may form an image of the letter “A” or an icon representing a website. Other examples are possible.
The first ultrasonic transceiver 104, second ultrasonic transceiver 106, third ultrasonic transceiver 108, and fourth ultrasonic transceiver 110 transmit ultrasonic signals and receive reflected ultrasonic signals back. As used herein, “ultrasonic” means signals in the 20-200 kHz frequency range. The transceivers 104, 106, 108, and 110 also convert the returned signal into an appropriate format that can be processed by a digital signal processing device. For example, the transceivers 104, 106, 108, and 110 convert the received signals into distance information in an appropriate format for a digital processing device (e.g., the processor 112).
In these regards, the transceivers 104, 106, 108, and 110 measure signal path times and object detection times for features approaching the user interface. As used herein, the signal path time is the total time for a signal to be generated at the transceiver, propagate at the speed of sound to the reflective feature (e.g., a finger), travel back from the reflective feature to the transceiver (again at the speed of sound), and be sensed by the transceiver. In other words, it is the total round-trip time from the transceiver to the reflective feature and back. The object detection time is one-half the signal path time.
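The timing terms defined above can be expressed as a short sketch (illustrative only; the helper names and the speed-of-sound constant are assumptions, not part of the disclosure):

```python
# Illustrative helpers for the timing terms defined above (assumed names).
SPEED_OF_SOUND_M_S = 343.0  # speed of sound in air at roughly 20 degrees C


def object_detection_time(signal_path_time_s: float) -> float:
    """Object detection time is one-half the round-trip signal path time."""
    return signal_path_time_s / 2.0


def feature_distance_m(signal_path_time_s: float) -> float:
    """One-way distance from the transceiver to the reflective feature."""
    return object_detection_time(signal_path_time_s) * SPEED_OF_SOUND_M_S


# Example: a 1 ms round trip corresponds to ~17.2 cm of one-way distance.
d = feature_distance_m(0.001)
```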
The processor 112 receives information from the transceivers 104, 106, 108, and 110 (which potentially indicates that a feature of interest is approaching the user interface 102) and maps this information to a particular bin (an area as described below) on the display. The identified bin then maps to a particular graphic display unit (e.g., a key on a keyboard or an icon). When the feature (e.g., the finger) approaching the graphical display unit is within a predetermined distance of the user interface 102, a command is sent to the display controller 114 to alter a characteristic of the visual item (e.g., increase the size of a key or icon).
The display controller 114 is configured to drive the user interface 102. For example, the display controller 114 receives information from the processor telling it how to adjust the screen and then has appropriate hardware/software to make the changes to the user interface 102. In one example, the user interface 102 is a touch screen with keys (as the graphic display units) and the display controller 114 increases the size (e.g., doubles or triples) of a particular key that is identified in the command from the processor 112.
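As a sketch of the enlargement the display controller 114 might perform, consider the following. The `Key` structure and the 2x default scale factor are illustrative assumptions, not details of the disclosure:

```python
from dataclasses import dataclass


@dataclass
class Key:
    """Hypothetical record for one graphic display unit on the touch screen."""
    label: str
    x: float       # center x on the display
    y: float       # center y on the display
    width: float
    height: float


def enlarge_key(key: Key, factor: float = 2.0) -> Key:
    """Return a copy of the key scaled about its center (e.g., doubled)."""
    return Key(key.label, key.x, key.y, key.width * factor, key.height * factor)


# Example: double the "A" key identified in the processor's command.
k = enlarge_key(Key("A", 100.0, 200.0, 20.0, 30.0))
```

Scaling about the key's center keeps the enlarged key under the approaching finger rather than shifting it toward a screen edge.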
Referring now to
Referring now to
At step 304, the feature location is determined from the ultrasonic detection times that have been received from one or more ultrasonic transceivers. Various approaches may be utilized to accomplish this functionality and one such example is described elsewhere herein.
At step 306, the height of the feature is determined from the ultrasonic detection times. For example, the height of the finger that is approaching the user interface (the height being the distance between the finger and the interface) is determined.
At step 308, it is determined whether the height is below a predetermined threshold. If the answer is negative, the system does nothing (e.g., no alteration to the user interface is made) at step 310. If the answer is affirmative, at step 312, characteristics of the graphical display unit are adjusted.
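The threshold decision at steps 308-312 can be sketched as follows; the 15 mm threshold is an assumed, illustrative value, not one taken from the disclosure:

```python
# Step 308 as a predicate: adjust the graphical display unit only when
# the feature (e.g., a finger) is closer than the threshold height.
def should_adjust(feature_height_mm: float, threshold_mm: float = 15.0) -> bool:
    """Return True when the feature height is below the threshold."""
    return feature_height_mm < threshold_mm
```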
Referring now to
At step 402, object detection times are received from the first ultrasonic transceiver 452 and the second ultrasonic transceiver 454. At step 404, the times define circles 422 and 424 on the display that intersect at points 432 and 434, and these points are determined at this step. These points also identify a vertical bin 433.
At step 406, object detection times are received from the third ultrasonic transceiver 456 and the fourth ultrasonic transceiver 458. At step 408, the times define circles 426 and 428 on the display that intersect at points 436 and 438, and these points are determined at this step. These points also identify a horizontal bin 435.
At step 410, the common bin (the intersection of bin 433 and bin 435) is determined.
At step 412, the common bin is mapped to a visual item (the graphical display unit) associated with the bin (e.g., in this example, bin 437 may be mapped to a key or icon).
At step 414, the identified graphical display unit is returned to the main calling approach, for example, as the result of step 304 of
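The two-circle intersections computed at steps 404 and 408 follow standard plane geometry. The sketch below is illustrative only; it assumes the transceiver positions and the radii (derived from the object detection times) are expressed in common display coordinates:

```python
import math


def circle_intersections(c1, r1, c2, r2):
    """Intersection points of two circles (centers c1, c2; radii r1, r2).
    Returns a list of 0, 1, or 2 (x, y) points."""
    (x1, y1), (x2, y2) = c1, c2
    d = math.hypot(x2 - x1, y2 - y1)
    # No intersection when circles are separate, nested, or coincident.
    if d == 0 or d > r1 + r2 or d < abs(r1 - r2):
        return []
    # Distance from c1 to the chord joining the intersection points.
    a = (r1**2 - r2**2 + d**2) / (2 * d)
    h = math.sqrt(max(r1**2 - a**2, 0.0))
    # Midpoint of the chord, then offset perpendicular to the center line.
    xm = x1 + a * (x2 - x1) / d
    ym = y1 + a * (y2 - y1) / d
    p1 = (xm + h * (y2 - y1) / d, ym - h * (x2 - x1) / d)
    p2 = (xm - h * (y2 - y1) / d, ym + h * (x2 - x1) / d)
    return [p1] if h == 0 else [p1, p2]
```

When the two transceivers lie along one edge of the display, the two intersection points share the coordinate along that edge, and it is this shared coordinate that identifies the bin.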
Referring now to
At step 502, object detection times are received from the first ultrasonic transceiver 552, the second ultrasonic transceiver 554, and the third ultrasonic transceiver 556. At step 504, the times are used to define three circles 522, 524, and 526 that intersect at point 532, and this point is determined at step 506. At step 508, this point 532 also identifies a unique bin 533.
At step 510, the bin 533 is mapped to a visual item (the graphical display unit) associated with the bin (e.g., in this example, bin 533 may be mapped to a key or icon).
At step 512, the identified graphical display unit is returned to the main calling approach, for example, as the result of step 304 of
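The three-circle intersection at steps 504-506 can be reduced to a linear system: subtracting the circle equations pairwise cancels the quadratic terms, leaving two linear equations in the unknown point. A minimal sketch, with transceiver coordinates and radii assumed known in display coordinates:

```python
def trilaterate_2d(p1, r1, p2, r2, p3, r3):
    """Common intersection point of three circles (centers p1..p3, radii
    r1..r3), found by subtracting circle equations pairwise."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # (circle 1) - (circle 2):  A*x + B*y = C
    A = 2 * (x2 - x1)
    B = 2 * (y2 - y1)
    C = r1**2 - r2**2 - x1**2 + x2**2 - y1**2 + y2**2
    # (circle 2) - (circle 3):  D*x + E*y = F
    D = 2 * (x3 - x2)
    E = 2 * (y3 - y2)
    F = r2**2 - r3**2 - x2**2 + x3**2 - y2**2 + y3**2
    det = A * E - B * D
    if abs(det) < 1e-12:
        raise ValueError("transceivers are collinear; no unique point")
    # Cramer's rule for the 2x2 system.
    x = (C * E - B * F) / det
    y = (A * F - C * D) / det
    return (x, y)
```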
Referring now to
At step 602, the object detection times for all transceivers are taken. At step 604, the intersection of these times is determined. In these regards, it will be appreciated that the times define three-dimensional spheres, each sphere having a radius corresponding to the object detection time as measured at a particular sensor (i.e., that time multiplied by the speed of sound). When four sensors are used, there will be a unique intersection of the four spheres. The intersection is a point, and this point can be determined by various mathematical approaches known to those skilled in the art.
At step 606, the height of the feature can be determined. For example, knowing the coordinates of the plane representing the user interface and the point of intersection determined at step 604, the distance between them can be determined using appropriate mathematical techniques known to those skilled in the art. At step 608, the determined height is returned to the main calling approach, for example, as the result of step 306 of
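One way to carry out steps 604 and 606 is sketched below. It assumes, as an illustrative choice rather than a requirement of the disclosure, that the transceivers lie in the display plane z = 0 and that the object detection times have already been converted to distances. Subtracting the sphere equations then cancels both the quadratic terms and z, leaving a two-variable linear system for (x, y), after which the height z is recovered from a single sphere equation; a fourth measurement can serve as a consistency check:

```python
import math


def locate_feature(anchors, ranges):
    """Locate the feature in 3-D from transceivers at z = 0.
    anchors: list of (x, y) transceiver positions in the display plane.
    ranges:  measured distances from each transceiver to the feature."""
    (x0, y0), r0 = anchors[0], ranges[0]
    # Subtracting sphere equations gives linear equations in x and y only,
    # because every transceiver shares z = 0. Two suffice for a solution.
    rows = []
    for (xi, yi), ri in zip(anchors[1:3], ranges[1:3]):
        a = 2 * (xi - x0)
        b = 2 * (yi - y0)
        c = r0**2 - ri**2 - x0**2 + xi**2 - y0**2 + yi**2
        rows.append((a, b, c))
    (a1, b1, c1), (a2, b2, c2) = rows
    det = a1 * b2 - b1 * a2
    x = (c1 * b2 - b1 * c2) / det
    y = (a1 * c2 - c1 * a2) / det
    # Recover the height from one sphere equation (taking the physical,
    # non-negative root above the display).
    z = math.sqrt(max(r0**2 - (x - x0)**2 - (y - y0)**2, 0.0))
    return x, y, z
```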
Preferred embodiments of this invention are described herein, including the best mode known to the inventors for carrying out the invention. It should be understood that the illustrated embodiments are exemplary only, and should not be taken as limiting the scope of the invention.
This application claims the benefit of and priority to U.S. Provisional Application No. 62/139,099, filed Mar. 27, 2015, the entire contents of which are hereby incorporated by reference.