The present disclosure relates generally to managing a user interface state between a locked state and an unlocked state and, more particularly, to the movement of a display element between an area of a display surface corresponding to a lock position and an area of the display surface corresponding to an unlock position. The respective unlock position can include placement within a predetermined area and a rotational direction having a predetermined orientation, and display elements can be moved from a lock position to an unlock position, as well as from an unlock position to a lock position.
The use of touch sensitive interfaces, including those incorporated as part of a touch sensitive display, has gained in popularity for the ease of use associated with a more intuitive interaction in accessing and controlling the functionality of an electronic device, including interacting with displayed elements and/or information. Furthermore, touch sensitive displays have greatly expanded the types of user interactions that can be regarded as a valid form of input. Many interfaces have made use of these expanded opportunities to extend the types of interactions that can be defined for interacting with the device and, more particularly, the various applications running on the device. These interactions have been expanded to include what has sometimes been referred to as gestures. In some cases, a gesture can be as concise as a brush across the touch sensitive surface. In other instances, a gesture can trace complicated patterns and include multiple points of interaction with the surface. In at least some instances, the location at which the gesture begins can be used to select the particular displayed element with which the user wishes to interact, and the subsequent traced movement along the surface of the display defines the nature of the interaction with the selected element. Still further, many interfaces have been designed to allow corresponding functionality to be performed in simple and succinct ways, with a trend toward a minimal number of steps and/or interactions, which in essence streamlines the interactions necessary for producing a desired effect.
Correspondingly, by increasing the types of interactions that will be viewed as a valid form of input and minimizing the number of steps needed to trigger a corresponding function, there is an increased chance that an unintended interaction will coincide with one of the expanded list of permissible gestures and trigger an unintended consequence. In essence, any stray movement of a body part of the user relative to the touch sensitive surface of the display has the potential to select an item being displayed with which the user can interact, and correspondingly the nature of the movement may be recognized as a gesture associated with a valid function and/or may trigger an action relative to the selected item. In some cases, a stray movement that is not intended as a purposeful interaction may be repeated in a regular fashion, which can compound or magnify the resulting interaction. For example, a user's hip or leg might brush against the display surface of the device with each step as the user walks while carrying the device. Correspondingly, each stray movement, or the repeated movements considered together, has the potential to be treated as a valid interaction despite its unintended origins.
As such, with expanded types of interactions and a set of streamlined interactions for producing an effect, it has become increasingly likely that a user can unknowingly activate functionality on the device, such as initiate a telephone call or manipulate a stored element, such as a file, including accidentally moving, copying or erasing the same through a stray interaction. In response to this, user interface developers have implemented lock screens, which temporarily disable at least a portion of the user interface, and generally require an unlock interaction before other types of interactions will be recognized. In some cases, the lock screen will be engaged after a fixed period of inactivity during which the user has not interacted with the device. In other instances, a lock screen state can be purposely initiated by the user.
However, for the same reasons that users desire more streamlined user interactions for producing desired and intended functionality, any interaction associated with the unlocking of a locked user interface should similarly avoid being overly burdensome or complex, lest the user find the feature frustrating and correspondingly disable it. Hence the challenge is to develop and provide a straightforward and intuitive interaction for unlocking a locked device which is not overly burdensome, but which also cannot readily be accidentally initiated.
Correspondingly, the present inventors have recognized that it would be beneficial to develop an apparatus and/or approach for transitioning between a user interface locked state and a user interface unlocked state, which is intuitive and not unduly burdensome to the user, while simultaneously reducing the risk that a stray or unintended interaction could accidentally transition the device to an unlocked state without the transition being the express intent of the user of the device.
The present disclosure provides among other features a user interface for an electronic device or other machine. The user interface has a touch sensitive display having a display surface, the touch sensitive display being adapted for presenting to a user at a respective position having a respective orientation at least one display element along the display surface. The touch sensitive display is further adapted for receiving from the user, a user interaction with the touch sensitive display at a location along the display surface. The user interface further includes a controller. The controller includes a user interface state module having an unlocked state and a locked state adapted for selectively enabling and disabling at least a portion of the user interface, wherein the portion of the user interface responds to a predetermined type of user interaction when in the unlocked state and does not respond to the predetermined type of user interaction when in the locked state. The controller further includes a state change module adapted for switching the state of the user interface state module between the locked state and the unlocked state. The state change module switches the state of the user interface module from the locked state to the unlocked state when the state change module detects each of the at least one display element is in a respective unlock position for the corresponding one of the at least one display element. The state change module includes an area detector and an orientation detector, wherein the respective unlock position for the corresponding one of the at least one display element includes a placement within a respective predetermined area and a rotational direction in a respective predetermined orientation. 
When the state change module switches the state of the user interface state module to a locked state, the state change module is adapted to respectively reposition at least one display element to a respective lock position including an area of the display surface other than within the respective predetermined area of the respective unlock position and to an orientation other than the respective predetermined orientation of the respective unlock position.
In at least one embodiment, the controller further includes a lock state interface module, where the lock state interface module is adapted to detect a received user interaction including the selection by the user of one of the at least one display element, and is further adapted to detect a further received user interaction including a postselection gesture, which moves the display element from a preselection position to a postgesture position having at least one of a placement within a new area and a new orientation.
In at least a further embodiment, the at least one of placement within a new area and a new orientation by the postselection gesture includes movement of the selected one of the at least one display element from the respective lock position to the respective unlock position, and movement of the selected one of the at least one display element from the respective unlock position to the respective lock position.
The present disclosure further provides among other features a user interface for an electronic device. The user interface has a touch sensitive display having a display surface, where the touch sensitive display is adapted for presenting to a user at a respective position a plurality of display elements along the display surface. The touch sensitive display is further adapted for receiving from the user a user interaction with the touch sensitive display at a location along the display surface. The user interface further includes a controller. The controller includes a user interface state module, which has an unlocked state and a locked state adapted for selectively enabling and disabling at least a portion of the user interface, wherein the portion of the user interface responds to a predetermined type of user interaction when in the unlocked state and does not respond to the predetermined type of user interaction when in the locked state. The controller further includes a state change module adapted for switching the state of the user interface state module between the locked state and the unlocked state. The state change module switches the state of the user interface module from the locked state to the unlocked state when the state change module detects each of the plurality of display elements in respective unlock positions for each of the corresponding display elements. The state change module includes a position detector wherein the respective unlock position for a corresponding one of the plurality of display elements includes placement within a respective predetermined position, where when the state change module switches the state of the user interface state module to a locked state, the state change module is adapted to respectively reposition at least one of the plurality of display elements to a position of the display surface other than within the respective unlock position. 
The controller still further includes a lock state interface module, said lock state interface module being adapted to detect a received user interaction including the selection by the user of one of the plurality of display elements, and being further adapted to detect a further received user interaction including a postselection gesture, which moves the selected one of the plurality of display elements from a preselection position to a postgesture position having a placement within a new position. In addition to being adapted to move the selected one of the plurality of display elements from the respective lock position to the respective unlock position, the lock state interface module is also adapted to move a selected one of the plurality of display elements from the respective unlock position to the respective lock position.
In at least one embodiment, when the state change module switches the state of the user interface state module to a locked state, the state change module is adapted to initially establish a position of all but one of the plurality of display elements within the respective predetermined unlock position.
The present disclosure still further provides a method for managing a state of a user interface between a locked state and an unlocked state. The method includes switching a state of the user interface from the unlocked state to the locked state. The user is then presented at least one display element via a display surface of a touch sensitive display. Each of the at least one display element is presented at a respective position having a respective orientation. When the state of the user interface is switched from the unlocked state to the locked state, the at least one display element is positioned in an area of the display surface other than a predetermined area of a respective unlock position and having a rotation other than a predetermined orientation of the respective unlock position. A repositioning of the at least one display element is then detected. The state of the user interface is then switched from the locked state to the unlocked state, when each of the at least one display element is detected in the respective unlock position of the corresponding at least one display element.
These and other objects, features, and advantages of this disclosure are evident from the following description of one or more preferred embodiments of this invention, with reference to the accompanying drawings.
While the present disclosure is susceptible of embodiments in various forms, there is shown in the drawings and will hereinafter be described presently preferred embodiments, with the understanding that the present disclosure is to be considered an exemplification of the invention and is not intended to limit the invention to the specific embodiments illustrated. Furthermore, while the various figures are intended to illustrate the various aspects of the present invention, the elements are not necessarily drawn to scale. In other words, the size, shape and dimensions of some layers, features, components and/or regions may be exaggerated and/or emphasized relative to other illustrated elements, for purposes of clarity or for better describing or illustrating the concepts intended to be conveyed.
The touch sensitive user interface 102 often includes a touch sensitive array, which has position sensors adapted for detecting a position and/or proximity of a corresponding pointer device relative to the touch sensitive user interface 102. Many existing forms of touch sensitive arrays are resistive or capacitive in nature. Still further, the touch sensitive array can even employ a force sensing element array for detecting an amount of force being applied at the selected location. In this way, a force threshold determination can be taken into account in determining an intended interaction, including the selection of an interactive element, such as a display element, or the making of a gesture. However, the use of other forms of touch sensitive arrays is possible without departing from the teachings of the present disclosure.
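The force threshold determination described above can be sketched as follows. This is an illustrative example only: the function name, the normalized force scale, and the threshold value are assumptions made for this sketch, not part of the disclosure or of any particular touch sensor API.

```python
FORCE_THRESHOLD = 0.2  # assumed normalized (0.0 to 1.0) force units


def is_intentional_touch(position, force, threshold=FORCE_THRESHOLD):
    """Treat a touch as an intended interaction only when it lands on the
    display surface and its reported force meets the assumed threshold."""
    x, y = position
    in_bounds = x >= 0 and y >= 0  # placeholder bounds check
    return in_bounds and force >= threshold
```

A deliberate press would typically report a force well above the threshold, while a glancing brush (such as a hip or leg contacting the surface) would often fall below it and be ignored.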
While the pointer device can include a user's finger 104, a stylus, or any other suitable, often generally elongated element for identifying a particular area associated with the touch sensitive array, in some instances the determination of an appropriate pointer may be affected by the particular technology used for the touch sensitive array, where a particular type of pointer may work better in conjunction with a particular type of array. In
However, as noted previously, not all interactions detected via the touch sensitive display may be the result of a desired action on the part of the user. In some instances an unintended interaction with the device may be made and detected proximate the touch sensitive surface of the device. As such, in some circumstances, it may be desirable to have the touch sensitive surface be in a locked state, which limits the nature and type of interactions that will be detected as a valid user input. Generally, while in a locked state the user interface will be focused on those particular actions which are intended to contribute to the transition of the user interface back to an unlocked state. The state of the user interface between a locked state and an unlocked state is managed by the controller 204. In support of this function, the controller 204 includes a user interface state module 210, which selectively enables and disables at least a portion of the user interface, including the types of interactions to which the interface will respond.
The controller further includes a state change module 212, which is adapted for switching the state of the user interface, which is managed by the user interface state module 210, between a locked state and an unlocked state. The state change module switches the state of the user interface module from the locked state to the unlocked state when the state change module detects that each of the at least one display element is in its respective unlock position, which generally includes placement within a respective predetermined area. In order to determine when the display elements are each in their respective predetermined areas of their unlock positions, the state change module includes an area detector 214. In addition to being within a respective predetermined area, in at least some instances a respective unlock position will additionally have a respective predetermined orientation. In order to determine when the display elements are each in their respective predetermined orientations of their unlock positions, the state change module further includes an orientation detector 216.
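The combined test performed by the area detector 214 and orientation detector 216 can be sketched as below. All names, data shapes, and tolerance values here are assumptions for illustration; the disclosure does not specify them.

```python
import math

# Assumed tolerances: how close an element must be to its predetermined
# area and orientation before it counts as being "in" its unlock position.
AREA_TOLERANCE = 10.0    # pixels of slack around the target point
ANGLE_TOLERANCE = 15.0   # degrees of slack around the target orientation


def in_unlock_area(element_pos, target_pos, tol=AREA_TOLERANCE):
    """Area detector: element center within `tol` of the target point."""
    return math.dist(element_pos, target_pos) <= tol


def in_unlock_orientation(angle, target_angle, tol=ANGLE_TOLERANCE):
    """Orientation detector: angular difference within `tol`, wrapping at 360."""
    diff = abs(angle - target_angle) % 360.0
    return min(diff, 360.0 - diff) <= tol


def all_elements_unlocked(elements):
    """True when every element matches both its area and its orientation.
    `elements` is an assumed list of dicts carrying current and target pose."""
    return all(
        in_unlock_area(e["pos"], e["target_pos"])
        and in_unlock_orientation(e["angle"], e["target_angle"])
        for e in elements
    )
```

Only when `all_elements_unlocked` reports a match for every display element would the state change module switch the user interface to the unlocked state.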
The controller 204 can in some instances still further include a lock state interface module 218 which manages the functioning of at least a portion of the device while the user interface is in a locked state. As part of that management, the lock state interface module 218 may monitor interactions with the touch sensitive surface of the display, and detect interactions with elements being displayed during the locked state of the user interface state module 210. The lock state interface module 218 further manages the elements 208 being displayed including their subsequent selection and movement including those intentionally or unintentionally prompted by the user, while the device is in a locked state.
When in a locked state, the user interface presents to the user at least one display element having a current respective position and a current respective orientation. In at least some instances, the act of unlocking may require a selection of a display element, and corresponding movement of the display element from a lock position to an unlock position. In these instances, in order to interact with the display element, the user needs to initiate a selection of the display element. Generally, the lock state interface module 218 will detect a user gesture including an attempted selection of a display element proximate the beginning point of a detected gesture, and a subsequent path traced by the pointer device until the tip of the pointer device is disengaged from its position proximate the surface 206 of the display. The subsequent path is sometimes referred to as a postselection portion of a gesture, and will sometimes define an action that can be used to affect the current position of the particular display element, if any, that has been selected. For example, in some instances, the postselection portion of the gesture can define a displacement and corresponding path of the selected display element, where an updated position of the display element will generally correspond to the end point of the postselection portion of the gesture.
While a user can visually detect a display element's current position, unintended interactions are generally blind. Correspondingly, an unintended interaction will only select a particular display element in instances where the unintended interaction coincides with the current location of the display element when the unintended interaction with the display surface is first detected. Furthermore, because the display element has an orientation, the orientation can be used to further define an unlock position. In such an instance, it is not sufficient to move the display element to the particular location corresponding to the unlock position; when at the unlock position, the display element also needs to have the correct orientation. Adjusting an orientation can involve one or more of several possible interactions with the display element, and in at least some instances can be a function of the particular path detected during the postselection portion of the gesture. In the same or other instances, an adjustment to orientation can be effected through a subsequent interaction with a display element. By layering the further feature of an orientation on top of the feature of a particular location, in order for an unlock condition to be detected a second match with regard to orientation needs to be present at the same time a first match with regard to location occurs. The likelihood that both matches occur unintentionally at the same time is far less than the likelihood of an unintentional match occurring that relies exclusively on location.
Still further, an analysis of the path defined by the postselection portion of the gesture can be used to detect an unintentional interaction, where in these instances the particular area through which the display element travels can include areas which are to be avoided. As noted previously, because unintentional interactions are generally blind, they generally cannot purposely avoid a particular area, at least not in the same manner in which a person consciously controlling the movement of a display element can detect and avoid a particular area. As such, in some instances, the lock state interface module 218 can include a path analyzer 220, which can include an avoid area analyzer 222.
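The path analyzer and avoid area analyzer described above can be sketched as follows. As a simplifying assumption, the gesture path is taken to be the sequence of touch samples reported by the sensor, and a drag is rejected if any sample falls inside an avoid area; the rectangle representation and function names are illustrative, not disclosed.

```python
def point_in_rect(point, rect):
    """Axis-aligned containment test; rect = (left, top, right, bottom)."""
    x, y = point
    left, top, right, bottom = rect
    return left <= x <= right and top <= y <= bottom


def path_avoids(path, avoid_areas):
    """Avoid area analyzer sketch: True when no sample of the traced path
    enters any avoid area, so the drag may proceed toward an unlock position."""
    return not any(
        point_in_rect(p, rect) for p in path for rect in avoid_areas
    )
```

A blind, unintentional drag has no way to steer around the avoid areas, so requiring `path_avoids` to hold adds another condition that stray interactions are unlikely to satisfy.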
As part of managing the movement of a display element when the user interface is in a locked state, the position and orientation of a display element can be managed by one or more gestures detected proximate the surface of the touch sensitive display. In some cases, a single gesture can affect both the position and orientation of the display element. In other instances, a particular gesture will affect position, and another separate gesture will affect orientation.
An “X” represents a virtual center of mass 308 of the display element, and a dashed line 310 extends between the center of mass 308 and the point at which the display element 302 is selected. As the display element 302 is pulled along the path corresponding to the postselection portion of the gesture, the display element can be designed to naturally rotate 310, with the center of mass of the display element tending to follow the traced path. In this way, a single gesture can be used to effect a change of location as well as a change of orientation, where both the point at which the gesture ends and the direction of movement proximate the end of the gesture 304 matter. Both would need to combine to result in a match with an unlock position that includes both a predetermined location and a predetermined orientation.
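The rotation-follows-drag behavior described above can be approximated by deriving the element's orientation from the latest segment of the traced path, so that the center of mass trails the grab point. This is a simplified assumption for illustration, not the disclosed implementation.

```python
import math


def drag_orientation(prev_point, new_point):
    """Orientation (degrees, 0-360) implied by the latest drag segment:
    the element is assumed to rotate so it trails the direction of travel."""
    dx = new_point[0] - prev_point[0]
    dy = new_point[1] - prev_point[1]
    return math.degrees(math.atan2(dy, dx)) % 360.0
```

Feeding successive touch samples through `drag_orientation` yields the element's final orientation at the end of the gesture, which the orientation detector could then compare against the predetermined unlock orientation.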
In at least some instances, an avoid area 408 can be used to restrict the types of valid paths that can be used to transition the display element 402 from its original lock position to an unlock position 406. For example, in some instances, if the display element intersects with an avoid area 408, the user interface might interrupt the gesture currently transitioning the display element 402 to a new location, and in some instances may return the display element 402 to its preselection position. While, intuitively, the same transition needs to occur to effect an unlocking of the user interface, requiring that the transition result in a display element having a particular location and orientation (as well as possibly needing to avoid certain paths) minimizes the number of potentially unintentional interactions that will unlock the device, without significantly increasing the burden on the user, from a conceptual and implementation viewpoint, when the goal of unlocking the device is being purposely pursued.
In at least some instances, when the device transitions from an unlocked state to a locked state, the at least one display element is randomly repositioned away from the unlock position, such that it deviates from the expected unlock position by a random amount in a random direction and has a rotation that differs by a random amount from the orientation of the unlock position. In such an instance, the particular motion that will produce a display element properly situated in the unlock position may be different each time. However, it is not necessary for the required position, orientation and path for unlocking the device to be different every time. In other words, the same or similar lock and unlock positions could be reused without departing from the beneficial teachings of the present application. Furthermore, the particular lock position and unlock position, including the respective locations and orientations (and avoid areas, if any), could in some instances be defined by the user.
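The randomized lock placement described above can be sketched as follows. The offset and rotation ranges are assumed values chosen for this example; the minimum offsets ensure the lock pose does not land back on the unlock pose by chance.

```python
import math
import random


def randomize_lock_pose(unlock_pos, unlock_angle,
                        min_offset=50.0, max_offset=200.0,
                        min_turn=30.0, max_turn=330.0):
    """Return an assumed (position, angle) lock pose: the element is offset
    a random distance in a random direction from its unlock position, and
    rotated a random amount away from its unlock orientation."""
    direction = random.uniform(0.0, 2.0 * math.pi)
    distance = random.uniform(min_offset, max_offset)
    x = unlock_pos[0] + distance * math.cos(direction)
    y = unlock_pos[1] + distance * math.sin(direction)
    turn = random.uniform(min_turn, max_turn)
    return (x, y), (unlock_angle + turn) % 360.0
```

Because both the displacement and the rotation are freshly randomized each time the device locks, the gesture required to restore the unlock pose generally differs from one lock cycle to the next.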
While
While for simplicity sake, display elements in
In
The controller 902 manages the state of the user interface between a locked state and an unlocked state. In support of this function, the controller 902 includes a user interface state module 906, which is similar to the user interface state module 210 discussed above in connection with
The controller further similarly includes a state change module 908, which is adapted for switching the state of the user interface, managed by the user interface state module 906, between a locked state and an unlocked state. The state change module switches the state of the user interface module from the locked state to the unlocked state when the state change module detects that each of the plurality of display elements is in its respective unlock position, which generally includes placement within a respective predetermined area. In some instances, determining when the display elements are each in the respective areas of their unlock positions can involve the state change module including an area detector 910 and/or an orientation detector 912.
The controller 902 further includes a lock state interface module 914, which manages the functioning of at least a portion of the device while the user interface is in a locked state. As part of that management, the lock state interface module 914 monitors interactions with the touch sensitive surface of the display, and detects interactions with elements being displayed during the locked state of the user interface state module 906. The lock state interface module 914 further manages the elements 904 being displayed, including their subsequent selection and movement, including those intentionally or unintentionally prompted by the user, while the device is in a locked state.
When in a locked state, the user interface presents to the user a plurality of display elements, each having a current respective position, some of which may correspond to a lock position and some of which may correspond to an unlock position. The act of unlocking generally requires a selection of each of the display elements in a lock position, and a corresponding movement of that display element from its lock position to its unlock position. As noted above, this can include placement within a particular area as well as a particular orientation.
The lock state interface module 914, in addition to managing the movement of display elements from their respective lock positions to their respective unlock positions, can manage the movement of a display element already in its unlock position to a position outside of its unlock position (in other words, to a lock position). While generally a user interacting with the touch sensitive display in an effort to purposely unlock the device will only interact with the display elements that are not already in their unlock position, unintentional interactions will generally not distinguish between display elements already in their unlock position and a display element in a lock position. When an unintended interaction moves a display element from an unlock position to a lock position, the device now has an additional element that needs to be transitioned back to its unlock position before the user interface of the device will transition to an unlocked state.
Generally, the default initial condition when the device is first put into a locked condition will involve the placement of one or two display elements in a position other than their respective unlock position. That means that generally a majority of the plurality of display elements will already be in their respective unlock position. However, where a user is likely able to discern and focus on the movement of only the display elements that are not already in their unlock position, interactions that are not being purposely directed by the user tend to be blind and random, and are therefore likely to select a display element already in an unlock position and move it out of its unlock position, because initially there are more display elements already in their unlock position. When this occurs, the device is moved further away from an unlocked condition, where not only do the display elements initially positioned in a lock position still need to be moved to their unlock position, but now at least one of the display elements that was initially in its unlock position needs to be moved back to its unlock position, which in turn further decreases the chance of an inadvertent unlocking of the device.
If a user then subsequently attempts to unlock the device and the display elements have not been too scrambled through inadvertent interactions with the device, the user can move the few display elements not in their unlock position to their unlock position in order to unlock the device. However, if the display elements have been significantly scrambled through one or more inadvertent interactions, the user interface can include a reset gesture that resets the display elements back to their initial conditions when the device was first locked, with a majority of the display elements in their unlock position and only one or a few display elements needing to be moved from a lock position to an unlock position. At least one example of a reset gesture can include touching corners (e.g. opposite corners) of the display surface at approximately the same time. Alternatively, the corners of the display surface can be touched in a predetermined sequence for furnishing a reset gesture. A reset gesture can also involve another user interface element, such as a switch having a physically movable actuator.
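The opposite-corners example of a reset gesture could be detected along the following lines. The corner size, timing window, and touch-event shape are all assumptions made for this sketch; the disclosure does not prescribe them.

```python
CORNER_SIZE = 40    # assumed pixels: how close to a corner a touch must land
TIME_WINDOW = 0.3   # assumed seconds allowed between the two corner touches


def corner_of(point, width, height, size=CORNER_SIZE):
    """Return which corner region (if any) a touch falls in, else None."""
    x, y = point
    horiz = "left" if x <= size else "right" if x >= width - size else None
    vert = "top" if y <= size else "bottom" if y >= height - size else None
    return (vert, horiz) if horiz and vert else None


def is_reset_gesture(touches, width, height, window=TIME_WINDOW):
    """`touches` = [(timestamp, point), ...]; True when two diagonally
    opposite corners are touched within `window` seconds of each other."""
    opposite = {("top", "left"): ("bottom", "right"),
                ("top", "right"): ("bottom", "left"),
                ("bottom", "right"): ("top", "left"),
                ("bottom", "left"): ("top", "right")}
    hits = [(t, c) for t, p in touches
            if (c := corner_of(p, width, height)) is not None]
    return any(
        abs(t1 - t2) <= window and opposite[c1] == c2
        for i, (t1, c1) in enumerate(hits)
        for t2, c2 in hits[i + 1:]
    )
```

A sequenced variant would instead check that the recorded corner hits match a predetermined ordering, rather than requiring near-simultaneity.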
In selecting display element 6, the interface may give preference to selecting a display element in a lock position over an element already in an unlock position. Correspondingly, where display element 6 overlaps portions of display elements 11, 12, 15 and 16, a selection in the overlap area will select display element 6. However, if another location is selected, other than the unlock position of display element 6 where currently no display element resides, a display element already in an unlock position will be selected, with the possibility that it will be moved out of its respective unlock position. For example, if display element 8 is selected it could be moved 1006 to a place 1008 other than its respective unlock position 1010. Generally, someone purposely trying to unlock the device will not move display elements already in their respective unlock position. However, under the illustrated conditions, inadvertent interactions, being generally blind, are more likely to select and move a display element already in an unlock position than to select and move a display element in a lock position to its unlock position. If that occurs, then it effectively becomes even less likely that further inadvertent interactions will unlock the display.
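The selection preference described above can be sketched as follows, with an element still in a lock position winning over an overlapping element already in its unlock position. The element record shape and the bounding-box hit test are assumptions made for this illustration.

```python
def hit_test(point, element):
    """Assumed axis-aligned bounding-box hit test;
    element["bounds"] = (left, top, right, bottom)."""
    x, y = point
    left, top, right, bottom = element["bounds"]
    return left <= x <= right and top <= y <= bottom


def select_element(point, elements):
    """Return the element under `point`, preferring any element that is not
    yet in its unlock position; None when nothing is under the point."""
    hits = [e for e in elements if hit_test(point, e)]
    if not hits:
        return None
    locked_hits = [e for e in hits if not e["in_unlock_position"]]
    return (locked_hits or hits)[0]
```

With this preference in place, a purposeful touch in the overlap area reliably grabs the element that still needs moving, while a blind touch elsewhere on the surface can still disturb an element already in its unlock position.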
In at least some embodiments, the controller 204, illustrated in
A storage element could include one or more forms of volatile and/or non-volatile memory, including conventional ROM, EPROM, RAM, or EEPROM. The storage element may still further incorporate one or more forms of auxiliary storage, which is either fixed or removable, such as a hard drive or a floppy drive. One skilled in the art will still further appreciate that still other forms of memory could be used without departing from the teachings of the present disclosure. In the same or other instances, the controller 204 or 902 may additionally or alternatively incorporate state machines and/or logic circuitry, which can be used to implement, at least partially, some of the modules and their corresponding functionality.
As noted previously, when in a locked state, at least a portion of the types of interactions that are generally allowed by the user interface are restricted. This can include all general access to the device with the exception of the actions which are interpreted in association with any perceived attempted unlocking of the device, or it can include access to one or more features or functions including access to one or more applications operating on the device. Access to these portions of the user interface will generally be restricted until the user interface is placed in an unlocked state, through the user executing a set of one or more actions relative to the device which triggers an unlocking of the user interface. In this way, unintended interactions which can trigger unintended consequences can be reduced.
While the preferred embodiments of the invention have been illustrated and described, it is to be understood that the invention is not so limited. Numerous modifications, changes, variations, substitutions and equivalents will occur to those skilled in the art without departing from the spirit and scope of the present invention as defined by the appended claims.
Number | Date | Country
---|---|---
61513017 | Jul 2011 | US