The present invention relates generally to a device and a method for supplying a user input to a hand-held device, and more particularly, to dual sided gestures performed relative to a device adapted for receiving touch input via multiple sides of the device.
With the trend toward smaller hand held devices, such as cell phones, and the continuing need to reserve surface space for interactive elements that enable the user to interact with the device, touch sensitive displays are increasingly being used, as they enable a device to visually convey information to a user while also enabling the user to interact contextually with displayed objects and otherwise provide user input to the device. Touch sensitive displays merge input and output functions for some portable electronic devices, functions which, absent such merging, might otherwise each require their own dedicated portion of the device surface. For example, many devices have historically incorporated a separate display and keypad on distinct portions of the external surface of the device.
However, some device designs have extended the size of the display by extending it into the surface space of the device that might otherwise have been separately dedicated to the location of a keypad. In some such instances, keypad-like input capabilities have been provided and/or maintained through the use of touch sensitive capabilities built into the extended display. One of the benefits of such a merger is the ability to dynamically change the size, shape and arrangement of keys, where each key can correspond to a subset of the surface space of the touch sensitive display associated therewith. Furthermore, each key can be accompanied by a visual indication conveyed through the integrated display, generally via the portions of the display that are currently active for each currently permissible form of user key selection and/or the immediately adjacent portions.
However, one of the difficulties associated with touch screen displays is that portions of the display can become obstructed by the user's fingers or hands when the user is attempting to provide input through the touch sensitive display interface at the same time the user is attempting to view the information being presented via the display. Furthermore, interaction with the display with one's fingers can often leave smudges, which, while they do not generally affect the operation of the device, can affect the appearance of the device and may also degrade the perceived image quality.
Consequently, some devices have incorporated touch sensitive surfaces located on the back side of the device, which are intended for use by the user to interact with and/or select items being displayed on the front side of the device. However, it is sometimes less than clear which location on the front facing display corresponds to the particular position currently being touched on the back of the device.
The use of a touch sensitive surface not only allows the location of an interacting object, such as a pointer, to be identified by the device, but in many instances also allows the movement of the interacting object to be tracked as a function of time as it moves across the touch surface. In this way, it is possible to detect gestures, which can be mapped to and used to distinguish a particular type of function to be implemented relative to the device and/or one or more selected objects. In some instances, multi-pointer gestures have been used to more intuitively identify certain desired functions, such as the two finger pinching or spreading motion that has sometimes been used to zoom in and zoom out.
However, multi-pointer gestures have generally been defined relative to a single touch sensitive input surface. Further, when one holds a device, it is common for one's hand to wrap around the side of the device from the back to the front. Correspondingly, the present inventors have recognized that it would be beneficial to enable interactions with multiple sides of the device to be tracked for purposes of defining and detecting interactive gestures, including interactive gestures involving multiple pointers. In this way, some gestures can be integrated with and/or made more compatible with an action that is similarly intended to grip or hold an object. Still further, the present inventors have recognized that it would be beneficial if the user could more readily correlate a particular point on the back of the device, with which the user is currently interacting, with the corresponding point or object being displayed on the screen and visible via the front of the device.
The present invention provides a method of performing a dual sided gesture on respective touch sensitive surfaces of a hand held electronic device. The method includes displaying an object on a display screen of the hand held electronic device that is viewable from at least one side of the hand held electronic device. A virtual center of gravity associated with the displayed object is then defined. Simultaneous gestures are then received, tracking the position and movement of an end of a pointer on each of a pair of respective surfaces of the hand held electronic device, each surface having a corresponding touch sensitive input. The location and movement of each gesture is then compared relative to the defined virtual center of gravity, and the displayed object is repositioned in response to the location and movement of each gesture relative to the defined virtual center of gravity.
In at least one embodiment, a detected difference in the direction of movement between the two gestures relative to the virtual center of gravity will produce a rotation of the object on the display screen, in a direction consistent with the detected difference from the perspective of a primary viewing side of the display screen.
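By way of non-limiting illustration, the following Python sketch suggests one way the rotation mapping just described might be computed. The function name, the small-angle estimate, and the assumption that both touch inputs report movement in a common, front-view coordinate frame are illustrative choices, not part of the claimed method.

```python
import math

def rotation_from_dual_gestures(front_delta, back_delta, pivot, front_pos):
    """Map simultaneous single-pointer gestures on the front and back
    touch sensitive inputs to a rotation of the displayed object about
    its virtual center of gravity (the pivot).

    All points and vectors are (x, y) pairs assumed to be expressed in a
    common, front-view coordinate frame. Returns a signed angle in
    radians (positive meaning counterclockwise as seen from the primary
    viewing side), or 0.0 when the gestures do not oppose one another.
    """
    # Opposing directions of movement are what distinguish a rotation.
    dot = front_delta[0] * back_delta[0] + front_delta[1] * back_delta[1]
    if dot >= 0:
        return 0.0

    # The lever arm from the pivot to the front pointer, crossed with the
    # front movement vector, gives the sense (sign) of the rotation.
    rx, ry = front_pos[0] - pivot[0], front_pos[1] - pivot[1]
    cross = rx * front_delta[1] - ry * front_delta[0]
    arm = math.hypot(rx, ry) or 1.0  # avoid division by zero at the pivot
    angle = math.hypot(front_delta[0], front_delta[1]) / arm  # small-angle
    return math.copysign(angle, cross)
```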
In at least a further embodiment, the respective surfaces include a primary side intended to be facing toward the primary user during usage and a secondary side intended to be facing away from the primary user during usage, and the method further includes receiving a gesture tracking the position and movement of an end of a pointer on only the surface corresponding to the secondary side of the hand held electronic device, which has a corresponding touch sensitive input. A display position of the displayed object is then moved laterally relative to the display screen by an amount corresponding to the detected distance and direction of movement of the end of the pointer relative to the surface of the secondary side of the hand held electronic device.
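Similarly, a minimal sketch of the secondary-side-only translation follows, assuming, as an illustrative convention rather than anything mandated above, that the horizontal axis of the back surface is mirror-reversed relative to the front view.

```python
def translate_from_back_gesture(object_pos, back_delta, mirror_x=True):
    """Move a displayed object laterally in response to a gesture detected
    only on the secondary (back) side touch sensitive input.

    object_pos and back_delta are (x, y) pairs. Because the back surface
    is observed mirror-reversed through the display, the horizontal
    component is negated by default; this convention is an assumption.
    """
    dx, dy = back_delta
    if mirror_x:
        dx = -dx  # map back-surface motion into the front-view frame
    return (object_pos[0] + dx, object_pos[1] + dy)
```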
The present invention further provides a method of performing a dual sided gesture on respective touch sensitive surfaces of a hand held electronic device. The method includes displaying an object on a display screen, where the display screen includes multiple layered transparent displays, including at least a primary side display, which is more proximate a primary viewing side intended to be facing toward a primary user during usage, and a secondary side display, which is less proximate the primary viewing side, and where the object is displayed upon one of the layered displays. The object being displayed upon one of the primary side display and the secondary side display is then selected. Upon selection of an object being displayed upon the primary side display, touching the secondary side touch sensitive surface will result in the display of the object being moved from the primary side display to the secondary side display, and upon selection of an object being displayed upon the secondary side display, touching the primary side touch sensitive surface will result in the display of the object being moved from the secondary side display to the primary side display.
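This layer-transfer behavior lends itself to a simple state rule; the following non-limiting sketch expresses it using hypothetical side labels.

```python
PRIMARY, SECONDARY = "primary", "secondary"

def layer_after_touch(current_layer, touched_side):
    """Return the display layer on which a selected object should appear
    after a touch: touching the touch sensitive surface on the side
    opposite the object's current layer transfers the object to that
    layer, while touching the same side leaves it in place.
    """
    if current_layer == PRIMARY and touched_side == SECONDARY:
        return SECONDARY
    if current_layer == SECONDARY and touched_side == PRIMARY:
        return PRIMARY
    return current_layer
```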
The present invention still further provides a hand held electronic device. The hand held electronic device includes a display screen for displaying an object viewable from at least one side of the hand held electronic device. The hand held electronic device further includes a pair of touch sensitive interfaces corresponding to opposite sides of the hand held electronic device adapted for tracking the position and movement of an end of a pointer on each of the respective touch sensitive interfaces. The hand held electronic device still further includes a user input controller. The user input controller has an object selection module for selecting an object being displayed on the display screen, and an object management module for detecting one or more gestures via one or more of the pair of touch sensitive interfaces and repositioning a selected object based upon the one or more gestures, where the object management module is adapted for defining a virtual center of gravity for a selected object, detecting a simultaneous gesture on each of the pair of touch sensitive interfaces, and repositioning the displayed object in response to the location and movement of each gesture relative to the defined virtual center of gravity.
These and other objects, features, and advantages of this invention are evident from the following description of one or more preferred embodiments of this invention, with reference to the accompanying drawings.
While the present invention is susceptible of embodiment in various forms, there are shown in the drawings, and will hereinafter be described, presently preferred embodiments, with the understanding that the present disclosure is to be considered an exemplification of the invention and is not intended to limit the invention to the specific embodiments illustrated. Furthermore, while the various figures are intended to illustrate the various claimed aspects of the present invention, the elements are not necessarily drawn to scale. In other words, the size, shape and dimensions of some layers, features, components and/or regions may be exaggerated and/or emphasized relative to other illustrated elements for purposes of clarity or for purposes of better describing or illustrating the concepts intended to be conveyed.
In the particular embodiment illustrated, the front portion of the display module 12 extends across a significant portion of the front face of the device 10, with the exception of areas 14, 16 to the left and the right of the display. For example, to the left of the display, an area 14 incorporating a set of dedicated keys 18 is illustrated. This area 14 might correspond to the bottom of the device 10 when the device 10 is oriented in support of voice communications, and can include a microphone 20, the device being positioned with the microphone 20 proximate the user's mouth for picking up voice signals. Alternatively, the area 16 to the right of the display, which might correspond to the top of the device when oriented in support of voice communications, could include a speaker 22 for positioning proximate the user's ear for conveying reproduced audio signals, which could be encoded as part of a signal received by the device 10.
As part of the display module 12, surfaces can be incorporated coinciding with each of the front side surface of the device 10 and the back side surface of the device 10, from which visual elements can be imaged so as to be viewable by a user. The surfaces of the display module 12 coinciding with the front side surface and the back side surface of the device 10 can also respectively include a touch sensitive input array that can be used to track the location and movement of a pointer, for example a user's finger 24 or thumb 26, as illustrated.
By incorporating a touch sensitive surface on both sides of the device, the user can interact with the device by touching one or both surfaces. This enables a user to select displayed elements and to associate a desired command or interactive effect with a particular displayed element or, more generically, with a function of the device itself. The interaction with a displayed element or the device 10 can be achieved through interactions with the touch sensitive surfaces of the display module 12 from either the front or the back. For some gestures or interactions with the device 10 or a displayed element, the effect may be the same regardless of whether the gesture or interaction is performed relative to the front surface or the back surface of the device 10. In other instances, the particular effect associated with a particular gesture or interaction may differ depending upon the side from which the gesture is performed and correspondingly detected. In still further instances, a gesture or interaction with the device 10 can incorporate positioning and movement that tracks multiple separate pointer positions on the same or alternative surfaces. In this way, various different gestures can be defined, enabling multiple types of interactions to be performed relative to the display module or a selected displayed element.
Given the transparent nature of the display module 12, and the fact that the display module may in some instances be intended to be seen through from one side to the other, accommodating the display of image elements that can be seen through portions of the device and in some circumstances viewed from both sides, the placement of other non-display related device elements, such as communication and control circuitry, processing circuitry and energy storage elements, may be somewhat restricted. More specifically, device elements that are not transparent, partially transparent, and/or selectively transparent generally should not be placed in an area through which the user is intended to be able to see, as they could otherwise be visible and/or obstruct the ability of the user to see through the display module and the associated portions of the device. Consequently, many of the circuit elements that are not associated with the transparent portions of the display are placed in the areas that do not support the more window-like observations through the device.
In at least some embodiments, the viewable display portion of the display module on one side of the device may be of a different size than the viewable display portion on the other side of the device. In such an instance, the larger viewing side surface (front or back) of the display module 12 will likely extend into areas that do not have potentially transparent, see through, window-like characteristics. Such areas are similarly possible where one viewing side is not necessarily larger than the other, but where the two viewing sides of the display module 12 are laterally offset, producing a potentially similar effect for each of the respective viewing sides.
One of the effects of such an area for one of the viewing sides of the display module 12, which does not have a respective see through arrangement, is the ability to have portions of the display that are viewable against an opaque background, in which the information being displayed in such an area for the particular side is not viewable from the other side. Such non-transparent regions can be sized and arranged to increase the overall size of the viewable display relative to a particular side, while still providing some transparency for seeing through the device 10, which can then be used to better confirm the position of a pointer interacting with the touch sensitive back surface of the device 10 and display module 12. Furthermore, the inclusion of the non-transparent regions within a given display area allows for an increase in the size of areas such as the left side area 14 and the right side area 16 described above.
Dashed lines 28 are shown in the accompanying figure.
However, while an exemplary hand held device 10 having a transparent display module 12 has been shown and described, the gestures defined below in connection with the present application can also be performed on devices having touch sensitive surfaces respectively associated with each of a pair of surfaces of the device with which the user can interact, regardless of whether some or all of the display module 12 is transparent, and/or whether the display module 12 of the device 10 has window-like capabilities.
Where multiple displays are used, the general intent in some instances is to enable elements displayed on the respective displays to be simultaneously viewable by a user in at least some operating modes or configurations. In such instances, the displayed elements might be viewed as being superimposed upon one another, which might give the display the appearance of having some depth. In other instances, the display might have discrete planes that are distinguishable by the user, whereby the user interaction with the displayed elements may depend upon the particular display upon which the corresponding element is being displayed. For example, one of the displays may be associated with a foreground, and another one of the displays may be associated with a background.
In at least some instances, the displays are arranged as and/or include a plurality of separately addressable display elements, which can be separately actuated to produce a varied visual effect. In some of these instances, a plurality of separately addressable elements, sometimes referred to as pixels, are arranged in a substantially planar, two dimensional, grid-like pattern. The pixels themselves often involve individual elements that can support at least a pair of states producing at least two different observable visual effects, such as a light being on or off, or an element being transparent or opaque. The visual state of multiple pixel elements can be controlled, and when viewed together the pixels can produce different visual images and effects.
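A two-state pixel grid of the kind just described can be modeled very simply; in the following non-limiting sketch, the dimensions and the meaning assigned to each state are purely illustrative.

```python
# A display plane as a grid of two-state pixels: True might denote lit
# (or opaque) and False dark (or transparent).
WIDTH, HEIGHT = 8, 4
pixels = [[False] * WIDTH for _ in range(HEIGHT)]

def set_pixel(x, y, on):
    """Separately address and actuate one element of the planar grid."""
    pixels[y][x] = on

# Actuating several adjacent elements together produces a simple visual
# effect, here a short horizontal line segment.
for x in range(2, 6):
    set_pixel(x, 1, True)
```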
Examples of suitable display technologies that might be used with the present application include non-light emitting displays, such as liquid crystal type displays, and light emitting displays, such as light emitting diode type displays, each of which can include individually addressable elements (i.e. pixels) that can be used to form the visual elements to be displayed. In at least one instance, an organic light emitting diode display can be used. An advantage of using a light emitting type display is that a separate light source, such as backlighting or a reflective back surface, need not be used for producing a user perceivable image; at least some such light sources would be difficult to incorporate in the context of a transparent, window-like display.
On one side of the display screen 102 is a primary side touch sensitive interface 104, corresponding to a primary side of a device. On the other side of the display screen 102 is a secondary side touch sensitive interface 106, corresponding to a secondary side of the device. The terms primary and secondary are relative and could easily be interchanged; together they generally refer to the elements corresponding to opposite sides of the device. It is further possible that the dual sided display module 100 could include still further elements, but the present description focuses on these elements as they serve as the basis for, and are later referenced in connection with, the discussion of further features described in the present application.
Each of the primary side touch sensitive interface 104 and the secondary side touch sensitive interface 106 can be used to detect the interaction and movement of a pointer relative to a respective surface of the device. The touch sensitive interfaces 104 and 106 can each make use of any of several different types of touch tracking technologies, including touch technology that is capacitive and/or resistive in nature. However, depending upon the type of technology selected, an interface may be capable of detecting different types of pointers, as well as different types of interactions with the touch sensitive interfaces 104 and 106.
In the case of capacitive-type touch sensitive interfaces, the interface can produce a detection field that extends through a dielectric substrate, such as glass or plastic, and can be used to detect the proximity of a conductive mass that enters or disturbs one or more fields, often arranged as an array of elements in a grid-like pattern. Generally, a touch sensitive interface 104 or 106 of this type will produce a plurality of electric fields, associated with a plurality of capacitive sensors, which can be sensed to determine the presence and current location of an encroaching conductive mass that has interacted with the respective fields. Such touch sensors are sometimes referred to as proximity touch sensor arrays.
In the case of resistive-type touch sensitive interfaces, the interface includes a plurality of points, often arranged as an array of elements positioned in a grid-like pattern, whereby the amount of pressure being applied can be detected. In such an instance, an array of elements whose resistance varies dependent upon the amount of force applied can be used not only to detect the presence and location of a touch, but also to provide an estimate of the amount of force being applied. Such touch sensors are sometimes referred to as force sensing touch sensor arrays. Because the force sensing is local relative to each detection point, a form of direct and discrete contact with the array of touch sensors may need to be possible, which often limits the opportunities for the presence of and/or the type of intervening layers.
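For either style of sensor array, locating a touch typically reduces to scanning the grid of per-element readings. The following non-limiting sketch computes a weighted centroid; the threshold value and the single-touch assumption are illustrative simplifications.

```python
def locate_touch(readings, threshold):
    """Estimate the location (and, for a force sensing array, the total
    force) of a single touch from a grid of per-element readings.

    readings[row][col] holds the signal from one sensor element: the
    capacitance disturbance for a proximity touch sensor array, or the
    resistance-derived force for a force sensing touch sensor array.
    Returns ((x, y), total_signal) as a weighted centroid over elements
    exceeding the threshold, or None when no element is active.
    """
    total = sx = sy = 0.0
    for row, line in enumerate(readings):
        for col, value in enumerate(line):
            if value > threshold:
                total += value
                sx += col * value
                sy += row * value
    if total == 0.0:
        return None
    return (sx / total, sy / total), total
```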
One skilled in the art will readily recognize that there exist still further types of touch detection technologies, each having its own set of limitations and features, which can be used without departing from the teachings of the present application.
In the illustrated embodiment, a pair of arrows 216 and 218 represents a user interaction in the form of multiple gestures simultaneously and respectively applied to multiple touch sensitive surfaces 204 and 206. In the particular embodiment illustrated, the pair of arrows 216 and 218 indicates a tracking of movement on the respective touch sensitive surfaces 204 and 206, with each movement in an opposite direction. Such respective movement on each of the surfaces 204 and 206 is defined to produce a rotation of the modeled object 208, which in turn results in a visual representation of the object in 2-D space from a different angle, as if the modeled object 208 had been rotated 220 about a virtual center of gravity 222.
In some instances, the center of gravity might be determined in reference to and based upon the dimensions of the display screen. In some of these instances, the center of gravity might coincide with the center point of the display, where the center point for purposes of determining the center of gravity may be defined relative to the size and shape of the display in one or both of the generally two dimensions across which the display extends. In other instances, the virtual center of gravity, similar to the direction and the amount of rotation, may be defined by one or more aspects of the detected gestures.
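The two alternatives for determining the virtual center of gravity might be sketched as follows; the midpoint rule used for the gesture-derived case is a hypothetical example rather than a required definition.

```python
def virtual_center_of_gravity(display_size, touch_points=None):
    """Choose the pivot for a dual sided rotation gesture.

    Defaults to the center point of the display, per the first
    alternative above; when both pointer locations are supplied, derives
    the pivot from the gesture itself, here by a hypothetical midpoint
    rule.
    """
    if touch_points and len(touch_points) == 2:
        (x0, y0), (x1, y1) = touch_points
        return ((x0 + x1) / 2.0, (y0 + y1) / 2.0)
    width, height = display_size
    return (width / 2.0, height / 2.0)
```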
In addition to being able to manipulate the visual representation of physical objects, the same interactive techniques could be applied to groupings or lists of elements, such as a list of items in a menu.
A gesture, such as a swiping motion represented by arrow 306, can be detected via the secondary side 206 of the device 200, which in turn can produce a movement of the list of elements 300 relative to the point of prominence 304, in a direction consistent with the swiping motion. In such an instance, the detected motion might produce a movement of the elements in the list 300 such that the element or item coinciding with the point of prominence 304 transitions from “item 3”, as illustrated in the figure, to “item 2” and then possibly “item 1”, depending upon the length or velocity of the movement corresponding to the gesture. Longer gestures or higher velocity gestures might result in a greater movement of the list 300, such that an item that is further away from the point of prominence 304 prior to the gesture being made is moved so as to coincide with the point of prominence 304 after the gesture is made.
The point of prominence 304 might include an outline or box, which can be used to highlight the particular point. Additionally and/or alternatively, the item coinciding with the point of prominence may have text that is otherwise enlarged or highlighted. After the position of the desired item coincides with the point of prominence, a tap on the primary touch sensitive surface 204 could result in a selection of that item. In some instances, the selection could be triggered by a tap coinciding with and/or positioned proximate the point of prominence.
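A non-limiting sketch of the list behavior described above follows; the pixels-per-item scale, and the folding of gesture velocity into an effective distance, are illustrative assumptions.

```python
class ProminenceList:
    """A scrollable list with a fixed point of prominence: a swipe on the
    secondary (back) surface shifts which item coincides with the point
    of prominence, and a tap on the primary surface selects that item.
    """
    PIXELS_PER_ITEM = 40  # assumed scale: gesture distance per list item

    def __init__(self, items):
        self.items = items
        self.index = 0  # index of the item at the point of prominence

    def on_back_swipe(self, distance_px):
        """Shift the list; longer (or, if velocity is folded into
        distance_px, faster) swipes move the list farther."""
        steps = round(distance_px / self.PIXELS_PER_ITEM)
        self.index = max(0, min(len(self.items) - 1, self.index + steps))

    def on_front_tap(self):
        """Select and return the item at the point of prominence."""
        return self.items[self.index]
```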
In some instances, touching and/or user interaction with the primary side touch sensitive surface 504 or the secondary side touch sensitive surface 506 will result in a displayed element being transitioned between different ones of multiple stacked displays. In other instances, the intensity of the elements being displayed on a particular one of the different displays may be affected. In any event, an ability to interact with multiple different touch sensitive surfaces can add another level of distinction to gestures that might otherwise be indistinguishable.
As a still further example, the particular touch sensitive surface with which the user interacts can be used to differentiate which one of multiple stacked objects the user is interacting with. For example, a stack of elements might include individual elements arranged in a particular order, where interacting with the back of the device might select and manipulate items from the bottom of the stack, and interacting with the front of the device might select and manipulate items from the top of the stack.
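Resolving which element of such a stack a touch addresses can then be keyed to the surface touched, as the following non-limiting sketch suggests.

```python
def pick_from_stack(stack, touched_side):
    """Resolve which element of a stack of overlapping displayed elements
    a touch addresses: the front surface addresses the top of the stack,
    and the back surface addresses the bottom.

    stack is a list ordered bottom-to-top; touched_side is "front" or
    "back".
    """
    if not stack:
        return None
    return stack[-1] if touched_side == "front" else stack[0]
```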
The hand held electronic device 600 further includes a user input controller 614, which can include an object selection module 616 and an object management module 618. The object selection module 616 is adapted for selecting an object being displayed on the display screen 608. The object management module 618 is adapted for detecting one or more gestures via one or more of the pair of touch sensitive interfaces 604 and 606, and for repositioning a selected object based upon the one or more detected gestures. In support of such a repositioning, the object management module 618 can define a virtual center of gravity 222 for a selected object or a group of selected objects, can detect simultaneous gestures on each of the pair of touch sensitive surfaces 604 and 606, and can reposition the displayed object in response to the location and movement of each gesture relative to the defined virtual center of gravity 222.
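The module structure of the controller 614 might be organized along the following lines, offered as a non-limiting sketch; the object methods rotate_about and translate are hypothetical stand-ins for the repositioning operations described above.

```python
class ObjectSelectionModule:
    """Tracks which displayed object, if any, is currently selected."""

    def __init__(self):
        self.selected = None

    def select(self, obj):
        self.selected = obj


class ObjectManagementModule:
    """Defines a virtual center of gravity for the selected object and
    repositions it in response to gestures from either touch interface."""

    def __init__(self, selection):
        self.selection = selection

    def on_gestures(self, front_delta, back_delta, pivot):
        obj = self.selection.selected
        if obj is None:
            return
        if front_delta and back_delta:
            # Simultaneous dual sided gestures: rotate about the pivot
            # (hypothetical object method).
            obj.rotate_about(pivot, front_delta, back_delta)
        elif back_delta:
            # Secondary side only: lateral translation (hypothetical).
            obj.translate(back_delta)


class UserInputController:
    """Groups the two modules, mirroring controller 614 and modules 616
    and 618 described above."""

    def __init__(self):
        self.selection = ObjectSelectionModule()
        self.management = ObjectManagementModule(self.selection)
```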
In some embodiments, the user input controller 614 could be implemented in the form of a microprocessor, which is adapted to execute one or more sets of prestored instructions 622, which may be used to form at least part of one or more controller modules 616 and 618. The one or more sets of prestored instructions 622 may be stored in a storage module 620, which is either integrated as part of the controller 614 or is coupled to the controller 614. The storage module 620 can include one or more forms of volatile and/or non-volatile memory, including conventional ROM, EPROM, RAM, or EEPROM. The storage module 620 may still further incorporate one or more forms of auxiliary storage, which is either fixed or removable, such as a hard drive or a floppy drive. One skilled in the art will still further appreciate that still other forms of memory could be used without departing from the teachings of the present invention. In the same or other instances, the controller 614 may additionally or alternatively incorporate state machines and/or logic circuitry, which can be used to at least partially implement some of the modules and their corresponding functionality.
While the preferred embodiments of the invention have been illustrated and described, it is to be understood that the invention is not so limited. Numerous modifications, changes, variations, substitutions and equivalents will occur to those skilled in the art without departing from the spirit and scope of the present invention as defined by the appended claims.