The present disclosure relates generally to electronic-device user-interface presentation and manipulation and, more particularly, to a system and method for adjusting user-interface characteristics based on a proximate user gesture.
Portable communication, entertainment, and computing devices such as cellular telephones, tablet computers, and so on have existed for quite some time, yet their capabilities continue to expand to this day. More efficient use of the wireless spectrum and the continued miniaturization of electronic components have yielded hand-held devices that can act as stand-alone computers, network nodes, personal digital assistants, and telephones.
There was a period in mobile-device development history when device miniaturization was a paramount consideration. However, as device capabilities expanded, ease of use began to eclipse miniaturization as a primary concern. Today, for example, many mobile devices have significantly more screen area than their progenitors. Indeed, some devices, often referred to as “tablet computers” or simply “tablets,” provide a screen area comparable to that of a small laptop computer.
However, while increased screen area has made it easier for users to interface with a device's full capability, such devices are still mobile devices and are often manipulated with only one hand. This may occur, for example, when a user is holding the mobile device in one hand while holding another object in the other hand.
The discussion of any problem or solution in this Background section simply represents an observation of the inventors and is not to be taken as an indication that the problem or solution represents known prior art. The present disclosure is directed to a method and system that exhibit one or more distinctions over prior systems. However, it should be appreciated that any such distinction is not a limitation on the scope of the disclosed principles or of the attached claims except to the extent expressly noted in the claims.
While the appended claims set forth the features of the present structures and techniques with particularity, these features, together with their objects and advantages, may be best understood from the following detailed description taken in conjunction with the accompanying drawings of which:
The following description is based on embodiments of the claims and should not be taken as limiting the claims with regard to alternative embodiments that are not explicitly described herein. As used herein, the term “mobile electronic device” refers to a portable device having a screen usable to receive user input used at least in part to provide telecommunications services or notifications to a user.
As noted above, when a user holds and interfaces with a mobile electronic device with a single hand, the area of the screen that the user can reach is generally limited to the region swept as the user pivots a finger or thumb. Although some mobile devices have a limited ability to manipulate the size and location of input elements (e.g., a calculator keypad or phone keypad), this approach enables the manipulation of device keyboards only and does not enable general application and system use. It is also difficult to enable or disable the altered mode in such systems. Moreover, such systems typically require the user to physically tap the display.
In an embodiment, the device display screen is a capacitive touch screen having the ability to distinguish between a touch event and a hover event. In this embodiment, hover events are intercepted and are used to activate gesture control for display scaling and panning to enable device access using one-handed navigation. In particular, hovering a digit (finger or thumb) over the screen activates a “resize” mode that temporarily shrinks the display image and moves it closer to the digit to make it more accessible.
In another aspect, a hover event is intercepted and used to trigger a zoom and pan mode, e.g., for users with poor eyesight or for when viewing small content. In both cases described, the interception and use of the hover event do not interfere with the underlying operation of running applications or require any participation from applications. In this way, a large phone display (e.g., a display that is about 5 inches or larger in diagonal measurement) can be made more accessible when only one hand of the user is available.
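By way of illustration only, the following Kotlin sketch shows one way such a deliberate hover might be detected and distinguished from a transient one; the HoverSample structure, its field names, and the threshold values are hypothetical assumptions made for the sake of the example and are not drawn from the disclosure.

```kotlin
import kotlin.math.sqrt

// Hypothetical hover sample as a capacitive touch controller might report it.
data class HoverSample(val x: Float, val y: Float, val heightMm: Float, val timeMs: Long)

// Reports true once a digit has hovered within radiusPx of its initial
// position for dwellMs, i.e., a deliberate hover that can trigger a mode.
class HoverDwellDetector(
    private val dwellMs: Long = 2000,  // predetermined hover period
    private val radiusPx: Float = 48f  // how still the digit must remain
) {
    private var anchor: HoverSample? = null

    fun onSample(s: HoverSample): Boolean {
        val a = anchor
        if (a == null || distance(a, s) > radiusPx) {
            anchor = s                 // digit moved: restart the dwell timer
            return false
        }
        return s.timeMs - a.timeMs >= dwellMs  // held still long enough
    }

    fun reset() { anchor = null }      // call on a touch or hover-exit event

    private fun distance(a: HoverSample, b: HoverSample): Float {
        val dx = a.x - b.x
        val dy = a.y - b.y
        return sqrt(dx * dx + dy * dy)
    }
}
```

Because the detector consumes only hover samples and produces a simple trigger signal, it can sit below the application layer, consistent with the non-interference described above.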
An exemplary device within which aspects of the present disclosure may be implemented is shown schematically in FIG. 1.
The device can also include a component interface 112 to provide a direct connection to auxiliary components or accessories for additional or enhanced functionality and a power supply 114, such as a battery, for providing power to the device components. All or some of the internal components may be coupled to each other, and may be in communication with one another, by way of one or more internal communication links 132, such as an internal bus.
The memory 106 can encompass one or more memory devices of any of a variety of forms, such as read-only memory, random-access memory, static random-access memory, dynamic random-access memory, etc., and may be used by the processor 104 to store and retrieve data. The data that are stored by the memory 106 can include one or more operating systems or applications as well as informational data. Each operating system is implemented via executable instructions stored in a storage medium in the device that control basic functions of the electronic device, such as interaction among the various internal components, communication with external devices via the wireless transceivers 102 or the component interface 112, and storage and retrieval of applications and data to and from the memory 106.
With respect to programs, sometimes also referred to as applications, each program is implemented via executable code that utilizes the operating system to provide more specific functionality, such as file-system service and handling of protected and unprotected data stored in the memory 106. Although many such programs govern standard or required functionality of the small touch-screen device, in many cases the programs include applications governing optional or specialized functionality, which can be provided in some cases by third-party vendors unrelated to the device manufacturer.
Finally, with respect to informational data, this non-executable code or information can be referenced, manipulated, or written by an operating system or program for performing functions of the device. Such informational data can include, for example, data that are preprogrammed into the device during manufacture or any of a variety of types of information that are uploaded to, downloaded from, or otherwise accessed at servers or other devices with which the device is in communication during its ongoing operation.
The device can be programmed such that the processor 104 and memory 106 interact with the other components of the device to perform a variety of functions, including interaction with the touch-detecting surface to receive signals indicative of gestures therefrom, evaluation of these signals to identify various gestures, and control of the device in the manners described below. The processor 104 may include various modules and may execute programs for initiating different activities such as launching an application, transferring data, and toggling through various graphical user-interface objects (e.g., toggling through various icons that are linked to executable applications).
The wireless transceivers 102 can include, for example as shown, both a cellular transceiver 103 and a wireless local area network transceiver 105. Each of the wireless transceivers 102 utilizes a wireless technology for communication, such as cellular-based communication technologies including analog communications, digital communications, next generation communications or variants thereof, peer-to-peer or ad hoc communication technologies, or other wireless communication technologies.
Exemplary operation of the wireless transceivers 102 in conjunction with other internal components of the device can take a variety of forms and can include, for example, operation in which, upon reception of wireless signals, the internal components detect communication signals, and one of the transceivers 102 demodulates the communication signals to recover incoming information, such as voice or data, transmitted by the wireless signals. After receiving the incoming information from the transceivers 102, the processor 104 formats the incoming information for the output components 108. Likewise, for transmission of wireless signals, the processor 104 formats outgoing information, which may or may not be activated by the input components 110, and conveys the outgoing information to one or more of the wireless transceivers 102 for modulation as communication signals. The wireless transceivers 102 convey the modulated signals to a remote device, such as a cell tower or an access point (not shown).
The output components 108 can include a variety of visual, audio, and mechanical outputs. For example, the output components 108 can include one or more visual-output components 116 such as a display screen. One or more audio-output components 118 can include a speaker, alarm, or buzzer, and one or more mechanical-output components 120 can include a vibrating mechanism, for example. Similarly, the input components 110 can include one or more visual-input components 122 such as an optical sensor of a camera, one or more audio-input components 124 such as a microphone, and one or more mechanical-input components 126 such as a touch-detecting surface and a keypad.
The sensors 128 can include both proximity sensors 129 and other sensors 131, such as an accelerometer, a gyroscope, any haptic, light, temperature, biological, chemical, or humidity sensor, or any other sensor that can provide pertinent information, such as to identify a current location of the device.
Actions that can actuate one or more input components 110 can include, for example, powering on, opening, unlocking, moving, or operating the device. For example, upon power on, a “home screen” with a predetermined set of application icons can be displayed on the touch screen.
As noted above, in an aspect of the disclosure the mobile electronic device is configured to receive and interpret a hover event in order to modify the user interface of the device.
As shown in screen 200 of FIG. 2, the user may enter the hover-zoom mode by hovering a digit over the device screen for a predetermined period of time.
Once in the hover-zoom mode, the device is configured to interpret the hover distance to determine the desired level of zoom or scale. In an embodiment, a greater distance between the screen and the user's digit is interpreted as a request for a smaller scale (e.g., until the display returns to its original scale), while a smaller distance is interpreted as a request for a larger scale. It will be appreciated that the exact relationship between hover distance and scale is not critical and that, for example, closer distances may instead represent a request for a smaller scale while greater distances may instead represent a request for a larger scale.
In an embodiment, changes in hover distance rather than the magnitude of the distance are used to select a desired scale. For example, in this aspect, if the hover distance used by the user to trigger the hover-zoom mode is one centimeter, then the screen display may be scaled at 100% when the hover-zoom mode is entered. Subsequent decreases in the hover distance may then be used as described above to increase the scale of the display, and from there, increasing the distance again will result in a reduction of the display scale, e.g., back to 100%.
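By way of a non-limiting sketch, the two mappings described above (absolute hover height to scale, and change in height relative to the triggering height) might be expressed in Kotlin as follows; the ranges, clamps, and function names are illustrative assumptions rather than features of the disclosure.

```kotlin
// Absolute mapping: closer digit => larger scale, clamped so the display
// never shrinks below its original (1.0) scale.
fun scaleFromHeight(
    heightMm: Float,
    minMm: Float = 5f,    // closest usable hover height
    maxMm: Float = 20f,   // height at or above which scale stays 1.0
    maxScale: Float = 3f
): Float {
    val t = ((maxMm - heightMm) / (maxMm - minMm)).coerceIn(0f, 1f)
    return 1f + t * (maxScale - 1f)
}

// Delta mapping: the height at which the mode was triggered corresponds to
// 100%, and subsequent decreases in height increase the scale from there.
fun scaleFromDelta(
    heightMm: Float,
    triggerHeightMm: Float,
    mmPerScaleUnit: Float = 10f,
    maxScale: Float = 3f
): Float {
    val delta = triggerHeightMm - heightMm  // positive when the digit is closer
    return (1f + delta / mmPerScaleUnit).coerceIn(1f, maxScale)
}
```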
As shown in screen 202 of FIG. 2, the displayed material is then enlarged in accordance with the scale selected via the hover distance.
As shown in FIG. 3, once the displayed material has been scaled, the user may pan the material by touching the screen and dragging; in an embodiment, the underlying material shifts in the direction and by the amount of the drag.
In another embodiment, the mobile device is configured such that a drag action shifts the viewport itself rather than shifting the underlying material. In this embodiment, a leftward drag action would actually pan the point of view, or viewport, to the left, much like panning a camera.
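These two drag interpretations differ only in the sign applied to the drag vector, as the following illustrative Kotlin sketch shows; the Offset type and function names are hypothetical.

```kotlin
data class Offset(val x: Float, val y: Float)

// Content panning: the material follows the digit, so a leftward drag
// shifts the displayed material to the left.
fun panContent(current: Offset, dragDx: Float, dragDy: Float) =
    Offset(current.x + dragDx, current.y + dragDy)

// Viewport panning: the point of view follows the digit instead, so a
// leftward drag pans the "camera" left and the material appears to move right.
fun panViewport(current: Offset, dragDx: Float, dragDy: Float) =
    Offset(current.x - dragDx, current.y - dragDy)
```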
When a user has finished using the hover-zoom mode, he may exit the mode by gesture as well. For example, in an embodiment, the mobile electronic device is configured to interpret one or more actions as a request to exit the mode. In one aspect, if the user lifts the digit out of screen-detection range, then the device is configured to exit the hover-zoom mode. Similarly, if the user touches and then releases the screen, then this may also serve as a request to exit the hover-zoom mode.
In a further embodiment, the mobile electronic device is configured to provide a hover-triggered resize mode. In this aspect, if the user hovers a digit over a spot on the screen for two seconds (or other predetermined time period), then the device will enter the hover-resize mode. In this mode, the device relocates and resizes the displayed material such that the material is visually concentrated closer to the user's digit.
In an aspect of this embodiment, the x-coordinate of the hover point is determined. If the hover x-coordinate is greater than half the screen width from the right, then the user is assumed to be left-handed, and a “resizing-rectangle” is overlaid from the bottom left-hand corner of the display to the hover location, showing the location to which the screen will be resized. If the hover x-coordinate is less than half the screen width from the right, then the user is assumed to be right-handed, and the overlay rectangle is anchored at the bottom right of the display. The overlay rectangle resizes with movement of the user's digit.
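The handedness inference and overlay anchoring just described might be sketched as follows; the OverlayRect type and the coordinate convention (origin at the top left, y increasing downward) are assumptions of the example.

```kotlin
data class OverlayRect(val left: Float, val top: Float, val right: Float, val bottom: Float)

// Anchors the resizing rectangle at the bottom corner on the same side of
// the screen as the hovering digit: a hover on the left half implies a
// left-handed grip (bottom-left anchor), and vice versa.
fun resizeOverlay(hoverX: Float, hoverY: Float, screenW: Float, screenH: Float): OverlayRect {
    val leftHanded = hoverX < screenW / 2f
    return if (leftHanded)
        OverlayRect(0f, hoverY, hoverX, screenH)       // bottom-left anchor
    else
        OverlayRect(hoverX, hoverY, screenW, screenH)  // bottom-right anchor
}
```

Recomputing the rectangle on each hover sample yields the live resizing behavior described above.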
This resizing functionality is illustrated in FIG. 4.
In an embodiment, the displayed material is resized to a standard size regardless of where the hover action occurs. In this embodiment, the decision as to which side will be used to anchor the reduced display may be made by default or may still depend upon the location of the triggering hover action.
The user may exit the resized mode in a number of ways. For example, in an embodiment, the mobile device is configured such that a digit tap on the screen is interpreted as a request to exit the resize mode. In another embodiment, the device is configured to exit the resize mode when the user lifts his digit from the screen. When the resize mode is exited, the device is configured in an embodiment to redraw the display to the last overlay size.
After the display has been resized and anchored, as is shown in FIG. 4, the user may interact with the resized display just as he would interact with the full-scale display.
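One plausible way to support such interaction, not specified in the disclosure, is to map each touch inside the resized region back to the coordinates the full-scale display would have received, so that applications see ordinary touch events; the linear mapping below is purely an assumption.

```kotlin
// Maps a touch inside the resized display region back to full-screen
// coordinates; region bounds follow a top-left-origin convention.
fun mapTouchToFullScreen(
    touchX: Float, touchY: Float,
    regionLeft: Float, regionTop: Float, regionRight: Float, regionBottom: Float,
    screenW: Float, screenH: Float
): Pair<Float, Float> {
    val sx = (touchX - regionLeft) / (regionRight - regionLeft)  // 0..1 across region
    val sy = (touchY - regionTop) / (regionBottom - regionTop)   // 0..1 down region
    return Pair(sx * screenW, sy * screenH)
}
```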
Although the example shown in FIG. 4 reflects a right-handed configuration, with the resized display anchored at the bottom right of the screen, it will be appreciated that the same principles apply to a left-handed configuration, with the resized display anchored at the bottom left of the screen.
The functions and processes described herein are executed by a computerized device, e.g., the mobile electronic device, via a processor within the device. The processor reads computer-executable instructions from a computer-readable medium and then executes those instructions to perform the appropriate tasks. The computer-executable instructions may also be referred to as “code” or a “program.” The computer-readable medium may be any non-transitory computer-readable medium.
In an embodiment, the instructions for executing the resizing and relocation functions described herein are application-agnostic. That is, the instructions are used by the device to perform global display manipulations regardless of what application or applications may be using display space. As such, in this embodiment, the various applications need not be aware of the display manipulations. Instead, they simply draw their displays as usual, and the device instructions, operating at a higher level, make the changes associated with a user hover or touch event.
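On an Android-like stack, one way to approximate this application-agnostic behavior is to transform the window's root (decor) view after applications have drawn into it; a true system-level implementation would more likely transform the composited surface. The sketch below is an approximation under that assumption.

```kotlin
import android.app.Activity
import android.view.View

// Applies a global scale and translation to everything the window has
// drawn, without any participation from the running applications. The
// pivot is placed at the bottom-left corner to match the anchoring
// behavior described above.
fun applyGlobalTransform(activity: Activity, scale: Float, dx: Float, dy: Float) {
    val root: View = activity.window.decorView
    root.pivotX = 0f
    root.pivotY = root.height.toFloat()
    root.scaleX = scale
    root.scaleY = scale
    root.translationX = dx
    root.translationY = dy
}
```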
In keeping with the foregoing, FIG. 5 illustrates a process 500 of gesture-based display zooming and panning in accordance with an embodiment of the disclosed principles. In the initial stages of the process, the device detects a digit hovering over the screen for a predetermined period of time and, in response, enters the hover-zoom mode.
Once the device has entered the hover-zoom mode, the device is configured to interpret the hover distance as a request for a desired level of zoom or scale. Thus, at stage 504, the distance between the screen and the user's digit is determined, and at stage 505, the determined distance is mapped to a desired scale factor. As noted above, a smaller distance may be mapped to a larger scale factor and a larger distance to a smaller scale factor, or vice versa. With the scale factor determined, the device resizes the displayed material and displays all or a portion of the resized material on the device screen at stage 506. As noted above, in an alternative embodiment, changes in hover distance rather than the magnitude of the distance are used to select a desired scale.
At stage 507, the device detects a user swipe or drag event, reflecting that the user has touched the display and then moved the touch point. The device interprets the detected motion as a pan command. As a result, the device translates the displayed material at stage 508 in the direction and by the amount indicated by the drag or swipe. This is referred to as panning the displayed material. In an alternative embodiment, the detected swipe event may be interpreted as panning the point of view or viewport. Thus, in either case, the process 500 allows the user to zoom and pan the display simply and easily via gesturing.
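Stages 504 through 508 can be read as a small state machine, sketched below with hypothetical event types; the dwell threshold and the height-to-scale mapping are illustrative, as in the earlier examples.

```kotlin
sealed interface UiEvent
data class Hover(val heightMm: Float, val stillMs: Long) : UiEvent  // hover sample
data class Drag(val dx: Float, val dy: Float) : UiEvent             // touch-drag motion
object HoverLost : UiEvent                                          // digit left detection range

class HoverZoomController(private val dwellMs: Long = 2000) {
    var active = false
        private set
    var scale = 1f
        private set
    var panX = 0f
        private set
    var panY = 0f
        private set

    fun onEvent(e: UiEvent) {
        when (e) {
            is Hover -> {
                if (!active && e.stillMs >= dwellMs) active = true  // enter hover-zoom mode
                if (active) scale = scaleFor(e.heightMm)            // stages 504-506
            }
            is Drag -> if (active) { panX += e.dx; panY += e.dy }   // stages 507-508
            HoverLost -> { active = false; scale = 1f; panX = 0f; panY = 0f }  // exit
        }
    }

    // Illustrative mapping: closer digit => larger scale, clamped to [1, 3].
    private fun scaleFor(heightMm: Float): Float =
        (1f + (20f - heightMm) / 10f).coerceIn(1f, 3f)
}
```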
As noted above, the device may also be configured to allow gesture-based resizing and relocation of the displayed material for ease of one-handed use. The process 600 shown in FIG. 6 illustrates an exemplary implementation of this functionality. In the initial stages of the process, the device detects a digit hovering over a location on the screen for a predetermined period of time, enters the hover-resize mode, and overlays a resizing rectangle anchored at the bottom corner of the display corresponding to the user's assumed handedness, the rectangle resizing with movement of the user's digit.
At stage 605, the device detects a user request to end the resizing mode, e.g., via a tap on the screen in the displayed area or by the user lifting the digit of interest away from the screen. Subsequently at stage 606, the device fixes the display in its last resized form, that is, with an upper corner resting at the last hover location prior to the end of the resizing mode. As noted above, the user may interact with the resized display.
At stage 607, the device detects a user command to return the display to its normal full scale, e.g., receipt of a user touch in the non-displayed area of the screen. Subsequently at stage 608, the device re-renders the display at its original full size.
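Stages 605 through 608 might be handled as sketched below; the Region type, the controller structure, and the method names are assumptions of the example.

```kotlin
data class Region(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    operator fun contains(p: Pair<Float, Float>) =
        p.first in left..right && p.second in top..bottom
}

class ResizeModeController {
    // Non-null while the display is fixed at a resized, anchored region.
    var fixedRegion: Region? = null
        private set

    // Stages 605-606: a tap (or the digit lifting away) ends the resizing
    // mode and fixes the display at the last overlay rectangle.
    fun onResizeEnded(lastOverlay: Region) {
        fixedRegion = lastOverlay
    }

    // Stages 607-608: a touch in the unused (non-displayed) screen area
    // restores the display to its original full size.
    fun onTouch(x: Float, y: Float) {
        val region = fixedRegion ?: return
        if (Pair(x, y) !in region) fixedRegion = null  // re-render at full size
    }
}
```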
It will be appreciated that the disclosed principles provide a novel way of enabling user interaction with a mobile electronic device via gestures. In view of the many possible embodiments to which the principles of the present discussion may be applied, it should be recognized that the embodiments described herein with respect to the drawing figures are meant to be illustrative only and should not be taken as limiting the scope of the claims. Therefore, the techniques as described herein contemplate all such embodiments as may come within the scope of the following claims and equivalents thereof.
The present application claims priority to U.S. Provisional Patent Application 61/831,639, filed on Jun. 6, 2013, which is incorporated herein by reference in its entirety.