The invention relates generally to electronic devices. More particularly, the invention relates to methods and devices for manipulating objects near or across discontinuities between touch sensitive screens of an electronic device.
In many applications, it is desirable to have multiple touch screen displays. For example, many electronic book readers are known to provide multiple screens. For devices that have discontinuous touch sensitive displays, it is desirable to be able to move objects not only within one display but also between the discontinuous displays. Traditionally, for multiple display devices, a standard pointing device such as a mouse is manipulated on a flat continuous surface and software maps the position on the continuous surface to the entire display of multiple screens. For devices with touch sensitive displays, however, the pointing device itself must cross a discontinuity, not just the object on the screen.
It would be desirable to provide intuitive and seamless movement of objects between screens. More particularly, it would be desirable to provide an easier, more intuitive and seamless way to manipulate objects near and across a touch screen discontinuity.
One embodiment of the invention includes an electronic device with a processor, a first touch screen and a second touch screen. The first touch screen displays an object. An object transition module executed by the processor includes executable instructions to map a gesture applied to the object to a set of object movement parameters and then move the object from the first touch screen to the second touch screen in accordance with the object movement parameters.
Another embodiment of the invention includes an electronic device with a processor, individual touch screens and an object transition module executed by the processor. The object transition module includes executable instructions to map a gesture to a set of object movement parameters and thereby trigger an exchange of contents between touch screens based on the object movement parameters.
Another embodiment of the invention includes an electronic device with a processor, individual touch screens and an object transition module executed by the processor. The object transition module includes executable instructions to identify a first gesture that designates a selected object on a first screen and a second gesture that moves the content of a second screen beneath the selected object on the first screen.
The invention is more fully appreciated in connection with the following detailed description taken in conjunction with the accompanying drawings, in which:
Like reference numerals refer to corresponding parts throughout the several views of the drawings.
For example, a flick gesture includes contacting the touch screen, moving the contact point across the screen and releasing the contact with the screen before the contact point has come to rest. The parameters of this gesture can determine the selection of the object to be moved, the trajectory of the object and its speed along that trajectory, its behavior across the discontinuity and its final position. The object is animated across the screen according to the computed parameters and trajectory.
Parameters of the flick gesture, such as the position of the point where contact was removed, the velocity of the contact point at the time of release, and the direction of that velocity, can be used in conjunction with a physics based algorithm that includes deceleration or damping to determine the speed and trajectory of the object. For example, physical properties may be simulated to give the visual element the appearance of behaving as a physical object might in the real world (e.g., as if the visual element were lying on a low-friction surface and flicked with the finger). Additionally, some degree of resistance may be associated with a screen edge, so that an object flicked with insufficient velocity/momentum to make it entirely across from one display to another is not left astride the two when it comes to a stop. One possible implementation is to simulate a ‘hump’ between the adjacent panels such that a visual element that is just short of the required velocity will ‘slide back’ onto the original display, and one with just above the required velocity will continue to slide entirely onto the adjacent display.
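By way of illustration only, the following sketch shows one way such a damping model and boundary ‘hump’ could be expressed in Java; the class name, constants (FRICTION, HUMP_VELOCITY, SEAM_X) and geometry are hypothetical, and this is not the only possible implementation.

```java
/**
 * Illustrative sketch only: damping a flicked object's release velocity frame
 * by frame, with a simulated "hump" at the seam between two adjacent displays.
 * All names and constants are hypothetical.
 */
public class FlickSimulator {
    static final double FRICTION = 0.90;        // per-frame velocity damping factor
    static final double HUMP_VELOCITY = 25.0;   // minimum speed (px/frame) needed to cross the seam
    static final double SEAM_X = 800.0;         // x coordinate of the boundary between the displays
    static final double TOTAL_WIDTH = 1600.0;   // combined width of both displays

    public static void main(String[] args) {
        double x = 600.0;   // object position when the contact is released
        double vx = 50.0;   // horizontal velocity at release (px/frame)

        while (Math.abs(vx) > 0.5) {
            // If the object reaches the seam without enough speed, push it back
            // onto the source display instead of leaving it astride the boundary.
            if (x < SEAM_X && x + vx >= SEAM_X && vx < HUMP_VELOCITY) {
                vx = -vx * 0.5;   // "slide back" behavior at the hump
            }
            x = Math.min(Math.max(x + vx, 0.0), TOTAL_WIDTH);
            vx *= FRICTION;       // deceleration, as if on a low-friction surface
        }
        System.out.printf("Object comes to rest at x=%.1f (%s display)%n",
                x, x < SEAM_X ? "source" : "target");
    }
}
```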
As the object nears the boundary, the object can be displayed as split across the boundary. For situations where the object is dragged in the traditional way, whether the object moves across the boundary can then be based not just upon the position of the touch indicator (finger, stylus, etc.) but also upon the boundaries of the visual display of the object. For example, if the object display does not move more than halfway into the target screen before the contact point is released, the object will not cross the boundary but will return to the source screen.
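A minimal sketch of the halfway rule described above, assuming a horizontal side-by-side layout; the class name, seam coordinate and object geometry are illustrative only.

```java
/**
 * Illustrative only: decide which display a dragged object settles on when the
 * contact point is released, based on how far the object's bounds extend past
 * the boundary. Names and geometry are hypothetical.
 */
public class DropTargetResolver {

    /** Returns true if the object should finish the drag on the target display. */
    static boolean crossesToTarget(double objectLeft, double objectWidth, double seamX) {
        double overlapOnTarget = (objectLeft + objectWidth) - seamX;
        // The object must be more than halfway onto the target display to cross;
        // otherwise it snaps back to the source display.
        return overlapOnTarget > objectWidth / 2.0;
    }

    public static void main(String[] args) {
        double seamX = 800.0;
        System.out.println(crossesToTarget(700.0, 150.0, seamX)); // 50 px across  -> false
        System.out.println(crossesToTarget(760.0, 150.0, seamX)); // 110 px across -> true
    }
}
```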
A flow chart of one embodiment of the flicking algorithm, as well as the behavior near the edges of the panels and near the discontinuity, is given in the accompanying drawings.
Upon receiving a touch event 201 the process then determines whether the finger is contacting the display screen 202. If it is, then the process next determines whether the finger was contacting the display screen in the previous check 203. If the finger was not contacting the display screen in the previous check then a timer is started to determine how long the finger has been contacting the display screen 204, the object underneath the finger is determined 205 and the relative position of the finger and object is determined 206. This information is later used to determine whether the finger has moved a distance greater than a specified threshold value 208 and whether the timer has expired 209.
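The bookkeeping of steps 204-209 could, for example, be captured as follows; this is an illustrative sketch with hypothetical class, field and threshold names, not the only possible implementation.

```java
/**
 * Illustrative sketch of the touch-down bookkeeping described above.
 * All class and field names, and the threshold values, are hypothetical.
 */
public class TouchDownState {
    long touchStartMillis;       // when the contact began (step 204)
    double startX, startY;       // contact position at touch-down
    double offsetX, offsetY;     // finger position relative to the object (step 206)
    Rect touchedObject;          // object under the finger, if any (step 205)

    static final double MOVE_THRESHOLD = 10.0;   // pixels (step 208)
    static final long HOLD_TIMEOUT_MS = 300;     // hold time before floating (step 209)

    void onTouchDown(double x, double y, Rect objectUnderFinger) {
        touchStartMillis = System.currentTimeMillis();
        startX = x;
        startY = y;
        touchedObject = objectUnderFinger;
        if (touchedObject != null) {
            offsetX = x - touchedObject.left;
            offsetY = y - touchedObject.top;
        }
    }

    boolean movedPastThreshold(double x, double y) {
        return Math.hypot(x - startX, y - startY) > MOVE_THRESHOLD;              // step 208
    }

    boolean holdTimerExpired() {
        return System.currentTimeMillis() - touchStartMillis > HOLD_TIMEOUT_MS;  // step 209
    }

    /** Minimal rectangle type so the sketch is self-contained. */
    static class Rect {
        double left, top, width, height;
        Rect(double left, double top, double width, double height) {
            this.left = left; this.top = top; this.width = width; this.height = height;
        }
    }

    public static void main(String[] args) {
        TouchDownState state = new TouchDownState();
        state.onTouchDown(120.0, 80.0, new Rect(100.0, 50.0, 200.0, 150.0));
        System.out.println("moved past threshold: " + state.movedPastThreshold(135.0, 80.0));
    }
}
```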
If the object is not floating, the finger has moved a distance greater than the threshold, and the timer has expired, then the object state is set to floating 210. Once an object is floating and the finger is contacting the display and was contacting the display immediately prior, the object is displaced with the finger 211. If the object then extends beyond the display edges 212, a correction is calculated 213 and is applied to bring the object back onto the display 214. An inverse correction is applied to the calculated finger position relative to the object 215. In this manner, objects can be placed into a floating state and dragged around the display corresponding to the motion of a finger in contact with the display.
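Steps 211-215 could be sketched as follows; the display dimensions, object size and variable names are hypothetical and serve only to illustrate the edge correction and its inverse.

```java
/**
 * Illustrative sketch of dragging a floating object with the finger and
 * correcting its position when it extends beyond the display edges
 * (steps 211-215). Names and geometry are hypothetical.
 */
public class DragWithEdgeCorrection {
    static final double DISPLAY_WIDTH = 800.0;
    static final double DISPLAY_HEIGHT = 600.0;

    public static void main(String[] args) {
        double objWidth = 150.0, objHeight = 100.0;
        double fingerOffsetX = 60.0, fingerOffsetY = 40.0;  // finger position within the object

        double fingerX = 790.0, fingerY = 300.0;            // current contact point

        // Step 211: displace the object with the finger.
        double objLeft = fingerX - fingerOffsetX;
        double objTop = fingerY - fingerOffsetY;

        // Steps 212-214: if the object extends beyond the display edges, compute a
        // correction and apply it to bring the object back onto the display.
        double correctionX = 0.0, correctionY = 0.0;
        if (objLeft < 0) correctionX = -objLeft;
        if (objLeft + objWidth > DISPLAY_WIDTH) correctionX = DISPLAY_WIDTH - (objLeft + objWidth);
        if (objTop < 0) correctionY = -objTop;
        if (objTop + objHeight > DISPLAY_HEIGHT) correctionY = DISPLAY_HEIGHT - (objTop + objHeight);
        objLeft += correctionX;
        objTop += correctionY;

        // Step 215: apply the inverse correction to the finger's position relative
        // to the object so later drag updates stay consistent.
        fingerOffsetX -= correctionX;
        fingerOffsetY -= correctionY;

        System.out.printf("object at (%.0f, %.0f), finger offset (%.0f, %.0f)%n",
                objLeft, objTop, fingerOffsetX, fingerOffsetY);
    }
}
```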
Once the finger is released from the display, such that the finger is not contacting the display but was immediately prior 216, the process next evaluates whether the object was floating 217. If the object was not floating, the gesture is treated as a standard tap 218. If the object was floating, then the process determines whether the finger was moving immediately before the contact was released 219. If the finger was moving, then the flick motion vector is calculated 220, and the object position is updated based on some combination of the motion vector, position, velocity, friction, etc. 221. The update 221 is repeated after an update time period 222 until the new position is the same as the old 223. If the finger was not moving and the object straddles displays 224, then the process determines which display the object should move to 225 and calculates the motion vector to correct the object position 226. If the finger was not moving and the object did not straddle the displays, then the object state is changed to not floating 227.
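The release handling of steps 219-223 could, for example, take the following form; the friction value, update period and rounding-based stop condition are illustrative assumptions rather than required choices.

```java
/**
 * Illustrative sketch of steps 219-223: if the finger was still moving when
 * contact was released, a flick motion vector is computed and the object
 * position is repeatedly updated until it stops changing. Names, friction
 * value, and update interval are hypothetical.
 */
public class FlickRelease {
    static final double FRICTION = 0.90;
    static final long UPDATE_PERIOD_MS = 16;    // step 222

    public static void main(String[] args) throws InterruptedException {
        double x = 300.0, y = 200.0;
        double vx = 24.0, vy = 6.0;             // step 220: flick motion vector at release

        double oldX, oldY;
        do {
            oldX = x;
            oldY = y;
            // Step 221: update the position from velocity, then apply friction.
            x += vx;
            y += vy;
            vx *= FRICTION;
            vy *= FRICTION;
            Thread.sleep(UPDATE_PERIOD_MS);     // step 222: wait one update period
        } while (Math.round(x) != Math.round(oldX) || Math.round(y) != Math.round(oldY)); // step 223

        System.out.printf("object comes to rest near (%.0f, %.0f)%n", x, y);
    }
}
```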
In addition, other objects on the screen can affect the trajectory behavior of the flicked object. For example, properties may be assigned to other on screen objects such that they mimic the effect of gravity or magnetism on the flicked object. An on screen object can attract slow-moving flicked objects but have a lesser effect on fast-moving flicked objects. Alternatively, particular areas of the screen or on screen objects may be assigned properties such as coefficients of friction that affect the trajectory of the flicked object and make particular screen areas more likely to be the final resting position of the flicked object.
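As an illustration only, the following sketch shows one way an on screen object could attract a slow-moving flicked object more strongly than a fast-moving one; the attraction formula and constants are hypothetical.

```java
/**
 * Illustrative sketch only: another on-screen object attracts a slow-moving
 * flicked object more strongly than a fast-moving one, roughly mimicking the
 * gravity/magnetism behavior described above. Constants are hypothetical.
 */
public class AttractorSketch {
    public static void main(String[] args) {
        double attractorX = 500.0, attractorY = 300.0;
        double strength = 2000.0;               // attraction strength constant

        double x = 100.0, y = 300.0;            // flicked object position at release
        double vx = 30.0, vy = 0.0;             // flicked object velocity at release
        double friction = 0.95;

        for (int frame = 0; frame < 200 && Math.hypot(vx, vy) > 0.5; frame++) {
            double dx = attractorX - x, dy = attractorY - y;
            double dist = Math.max(Math.hypot(dx, dy), 1.0);
            double speed = Math.hypot(vx, vy);

            // The pull scales inversely with the flicked object's speed, so a slow
            // object is deflected noticeably while a fast one barely changes course.
            double pull = strength / (dist * dist * (1.0 + speed));
            vx += pull * dx / dist;
            vy += pull * dy / dist;

            x += vx;
            y += vy;
            vx *= friction;
            vy *= friction;
        }
        System.out.printf("flicked object settles near (%.0f, %.0f)%n", x, y);
    }
}
```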
In another aspect of the invention, using a gesture in a particular screen location can move the contents of one display to another, change the display mode from one screen to multiple screens, or exchange the contents of adjacent displays. In one embodiment, using a flick gesture on the title bar of an application or program in one-page mode can cause the application or program to switch panels if the direction of the flick is toward the opposite panel. Likewise, a pinch close motion on the title bar of a two-page application can change the mode to one-page, and a pinch open motion on the title bar of a one-page application can change the mode to two-page. A pinch close across the spine when two one-page applications are displayed switches the two applications that are running in one-page mode. In another embodiment the pinch close could also be used to switch the pages across the spine within an application running in two-page mode.
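One possible, purely illustrative way to express such a mapping from gesture and screen location to a display-level action is sketched below; the enum values and method names are hypothetical.

```java
/**
 * Illustrative sketch only: mapping a gesture and its screen location to a
 * display-level action such as switching panels, changing between one-page and
 * two-page mode, or swapping the contents of the two displays.
 */
public class ScreenGestureRouter {
    enum Gesture { FLICK_TOWARD_OTHER_PANEL, PINCH_CLOSE, PINCH_OPEN }
    enum Location { TITLE_BAR, SPINE }
    enum DisplayMode { ONE_PAGE, TWO_PAGE }

    enum Action { MOVE_APP_TO_OTHER_PANEL, SWITCH_TO_ONE_PAGE, SWITCH_TO_TWO_PAGE,
                  SWAP_PANEL_CONTENTS, NONE }

    static Action route(Gesture gesture, Location location, DisplayMode mode) {
        if (location == Location.TITLE_BAR) {
            if (gesture == Gesture.FLICK_TOWARD_OTHER_PANEL && mode == DisplayMode.ONE_PAGE)
                return Action.MOVE_APP_TO_OTHER_PANEL;
            if (gesture == Gesture.PINCH_CLOSE && mode == DisplayMode.TWO_PAGE)
                return Action.SWITCH_TO_ONE_PAGE;
            if (gesture == Gesture.PINCH_OPEN && mode == DisplayMode.ONE_PAGE)
                return Action.SWITCH_TO_TWO_PAGE;
        }
        if (location == Location.SPINE && gesture == Gesture.PINCH_CLOSE)
            return Action.SWAP_PANEL_CONTENTS;   // swap the two one-page applications (or pages)
        return Action.NONE;
    }

    public static void main(String[] args) {
        System.out.println(route(Gesture.PINCH_CLOSE, Location.TITLE_BAR, DisplayMode.TWO_PAGE));
        System.out.println(route(Gesture.PINCH_CLOSE, Location.SPINE, DisplayMode.ONE_PAGE));
    }
}
```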
In yet another aspect of the invention, moving objects between pages is accomplished by moving the display rather than the object. For example, the object could be held stationary on one screen while another gesture could be used to replace everything on the screen with the contents of the opposite panel, except the selected stationary object. The selected stationary object would appear on top of the new display content. This could be implemented by using a gesture to put the object into a floating mode where the object is detached from the page. While the object is in this state, other objects within the displays can be changed in the traditional manner (page turning, navigating etc.). The selected object could be reattached to a different page (either in the same application or a different one) by another gesture such as tapping on the object.
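An illustrative sketch of this detach/reattach sequence is given below; the page and object representations are hypothetical placeholders for whatever content model the device actually uses.

```java
/**
 * Illustrative sketch only: an object is detached into a "floating" state, the
 * underlying page content is changed beneath it, and the object is reattached
 * to the new page by a tap. Class and method names are hypothetical.
 */
public class FloatingObjectSketch {
    static class Page {
        final String name;
        final java.util.List<String> objects = new java.util.ArrayList<>();
        Page(String name) { this.name = name; }
    }

    public static void main(String[] args) {
        Page pageA = new Page("Page A");
        Page pageB = new Page("Page B");
        pageA.objects.add("photo");

        // A gesture puts the object into a floating state, detaching it from its page.
        String floating = "photo";
        pageA.objects.remove(floating);

        // While the object floats, the display beneath it can be changed in the
        // traditional manner (page turning, navigating, swapping panels, etc.).
        Page currentPage = pageB;

        // A tap on the floating object reattaches it to whatever page is now shown.
        currentPage.objects.add(floating);

        System.out.println(pageA.name + ": " + pageA.objects);
        System.out.println(pageB.name + ": " + pageB.objects);
    }
}
```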
An embodiment of the present invention relates to a computer storage product with a computer readable storage medium having computer code thereon for performing various computer-implemented operations. The media and computer code may be those specially designed and constructed for the purposes of the present invention, or they may be of the kind well known and available to those having skill in the computer software arts. Examples of computer-readable media include, but are not limited to: magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs, DVDs and holographic devices; magneto-optical media; and hardware devices that are specially configured to store and execute program code, such as application-specific integrated circuits (“ASICs”), programmable logic devices (“PLDs”) and ROM and RAM devices. Examples of computer code include machine code, such as produced by a compiler, and files containing higher-level code that are executed by a computer using an interpreter. For example, an embodiment of the invention may be implemented using JAVA®, C++, or another object-oriented programming language and development tools. Another embodiment of the invention may be implemented in hardwired circuitry in place of, or in combination with, machine-executable software instructions.
The foregoing description, for purposes of explanation, used specific nomenclature to provide a thorough understanding of the invention. However, it will be apparent to one skilled in the art that the specific details are not required in order to practice the invention. Thus, the foregoing descriptions of specific embodiments of the invention are presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the invention to the precise forms disclosed; obviously, many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, thereby enabling others skilled in the art to best utilize the invention and various embodiments with various modifications as are suited to the particular use contemplated. It is intended that the following claims and their equivalents define the scope of the invention.
This application claims priority to U.S. Provisional Patent Application 61/396,789 filed Jun. 1, 2010, entitled “Electronic Device for Education”, the contents of which are incorporated herein by reference.
Number | Date | Country
---|---|---
61396789 | Jun 2010 | US