Claims
- 1. Apparatus for manipulating a first object between discontinuous source and target touch-screens of a computer comprising: a single virtual display, the first object being displayed on the source touch-screen and being known in the virtual display by unique parameters; a buffer in the computer for storing the unique parameters of the first object; means for triggering manipulation of the first object from the source touch-screen to the target touch-screen; means for releasing the first object's parameters from the buffer for display of the first object on the target touch-screen; and program means on the computer for selecting the first object upon a contact action between a pointer and the source touch-screen, for implementing the triggering manipulation means, and for implementing the releasing means.
- 2. The apparatus as recited in claim 1 wherein the program means further comprises means for recognizing a predefined contact action of the pointer upon, and restricted to, the source touch-screen.
- 3. The apparatus as recited in claim 1 wherein the program means further comprises a software latch for maintaining the first object in the buffer despite loss of contact between the pointer and the source touch-screen.
- 4. The apparatus as recited in claim 3 wherein the software latch further comprises a timer having a predetermined timeout so that the releasing means only release the first object's parameters from the buffer if actuated before the timer's timeout.
- 5. The apparatus as recited in claim 4 wherein the means for releasing the first object further comprises contact between the pointer and the target touch-screen.
- 6. The apparatus as recited in claim 2 wherein the predefined contact action of the pointer is to drag the pointer and the first object to predetermined co-ordinates while the pointer remains in contact with the source touch-screen, the arrival of the pointer at said predetermined co-ordinates activating the triggering manipulation means.
- 7. The apparatus as recited in claim 6 further comprising means for displaying a temporary second object on the virtual display, the second object representing the buffer and which is draggable on the virtual display.
- 8. The apparatus as recited in claim 7 wherein the predetermined co-ordinates on the source touch-screen are along a boundary of the source touch-screen.
- 9. The apparatus as recited in claim 7 wherein the predetermined co-ordinates on the source touch-screen are at co-ordinates of a third object displayed on the source touch-screen.
- 10. The apparatus as recited in claim 2 wherein the predefined contact action of the pointer is a gesture, the apparatus further comprising: gesture-recognition means for comparing the gesture of the pointer on the source touch-screen with predefined gestures and wherein if a gesture is recognized as being one of the predefined gestures the triggering manipulation means is activated for manipulating the first object to the target touch-screen.
- 11. Apparatus for manipulating a first object between discontinuous source and target screens of a single virtual display of a computer, the first object being displayed on the source screen and being known in the virtual display by unique parameters, comprising: means for selecting the first object on the source screen; a buffer for storing the first object's parameters when it is selected; means associated with the source screen, activated by the user through a predefined motion of the pointer upon and restricted to the source screen, for manipulating the first object from the source screen to the target screen; means which, when actuated, release the first object's parameters from the buffer for display of the first object on the target screen; microphone means for receiving voice commands and emitting digitized voice signals; and voice recognition means for receiving and recognizing digitized voice signals and for determining if a voice command is recognized as having identified a unique parameter of the first object and if a voice command is recognized as having identified a source screen, and wherein the means for selecting the first object comprises determining if the identified first object is displayed on the identified source screen.
- 12. The apparatus as recited in claim 11 further comprising: an eye-tracking interface for detecting which of the source or target screens is being watched by the user; and wherein the means for selecting the first object comprise determining if the identified first object is displayed on the identified source screen.
- 13. A process for manipulating a first object between discontinuous source and target touch-screens of a single virtual display of a computer, the first object being displayed on the source touch-screen and being known in the virtual display by unique parameters, the process comprising the steps of: selecting the first object from the source touch-screen when the first object is contacted by a pointer; storing the first object's unique parameters in a buffer in the computer when it is selected; applying a program on the computer to sense contact of the pointer to the touch-screens and for triggering manipulation of the first object from the source touch-screen to the target touch-screen; and releasing the first object's parameters from the buffer for display of the transferred first object to the target touch-screen.
- 14. The process as recited in claim 13 wherein the first object is manipulated to the target touch-screen by latching the first object's stored parameters in the buffer and maintaining them therein until released to the target touch-screen despite lifting of the pointer from contact with the source touch-screen.
- 15. The process as recited in claim 14 further comprising: initiating a timer upon latching the buffer, the timer having a predetermined timeout; and releasing the first object's parameters to the target touch-screen before the timer reaches timeout.
- 16. The process as recited in claim 13 further comprising: setting a cut flag which specifies that the first object is to be deleted after release to the target touch-screen; checking the state of the cut flag upon releasing the first object's parameters to the target touch-screen; and deleting the first object from the source touch-screen if the cut flag is set.
- 17. The process as recited in claim 13 wherein the releasing of the first object's parameters from the buffer comprises touching the pointer to the target touch-screen.
- 18. The process as recited in claim 13 wherein the first object is manipulated to the target touch-screen by: defining a hot switch zone on the source touch-screen; dragging the pointer and selected first object across the source touch-screen; and impinging the first object with the hot switch zone for transferring the first object to the target touch-screen.
- 19. The process as recited in claim 18 wherein the hot switch zone is a boundary of the source touch-screen.
- 20. The process as recited in claim 13 wherein the first object is manipulated to the target touch-screen by: dragging the pointer and first object across the source touch-screen; and comparing the velocity of the dragged first object against a predetermined drag velocity and, if the velocity is greater than the predetermined drag velocity, transferring the first object to the target touch-screen.
- 21. The process as recited in claim 18 further comprising: comparing a velocity of the first object when it impinges the hot switch zone against a predetermined drag velocity and if the first object's velocity is greater than the predetermined drag velocity then the first object is transferred to the target touch-screen.
- 22. The process as recited in claim 18 wherein the hot switch zone is a third object displayed on the source touch-screen.
- 23. The process as recited in claim 22 further comprising: forming a virtual second object on the target touch-screen when the first object impinges the third object; mapping the source touch-screen to the display on the target touch-screen; and dragging the pointer over the source touch-screen for dragging the virtual second object over the target touch-screen so that, when the first object is released, the first object is transferred to the target touch-screen at the location of the virtual second object.
- 24. The process as recited in claim 22 further comprising: displaying a virtual target screen on the source touch-screen when the first object impinges the third object; and dragging the pointer over the source touch-screen for scrolling the virtual target screen progressively under the first object on the source touch-screen so that, when the first object is released, the first object is transferred to the target touch-screen to a location corresponding to where the first object was located over the virtual target screen.
- 25. The process as recited in claim 18 further comprising: displaying a menu of options when the first object impinges the hot switch zone; and selecting an option from the menu so that the first object is transferred to the target touch-screen according to the menu option.
- 26. The process as recited in claim 25 wherein one menu option is a copy option for transferring and releasing the first object to the target touch-screen while leaving a copy of the first object on the source touch-screen.
- 27. The process as recited in claim 25 wherein one menu option is a cut option for: transferring and releasing the first object to the target touch-screen; and deleting the first object from the source touch-screen.
- 28. The process as recited in claim 13 wherein the first object is manipulated to the target touch-screen by: dragging the pointer across the source touch-screen as a gesture; and comparing the gesture against pre-determined gestures so that if it matches a known pre-determined gesture then the first object is transferred onto the target touch-screen.
- 29. The process as recited in claim 28 wherein the gesture matches a pre-determined copy gesture so that the first object is transferred to the target touch-screen and when released thereto, a copy of the first object remains on the source touch-screen.
- 30. The process as recited in claim 28 wherein the gesture matches a pre-determined cut gesture so that the first object is transferred to the target touch-screen and when released thereto, the first object is deleted from the source touch-screen.
- 31. The process as recited in claim 13 further comprising the steps of: providing a wireless pointer having one or more buttons, the state of the buttons being determinable; touching the pointer to the source touch-screen to select the first object; actuating a first button on the wireless pointer for latching the first object's parameters in the buffer and maintaining them there until released; touching the wireless pointer to the target touch-screen at a release point to which the first object is to be dragged; and actuating the first button for releasing the first object to the target touch-screen.
- 32. The process as recited in claim 31 further comprising: actuating a second button on the pointer for displaying a context option menu on either of the source and target touch-screens; touching the context menu for selecting a first manipulation option therefrom; touching the pointer to the target touch-screen at a location where the first object is to be released; actuating the second button on the wireless pointer for displaying the context menu; and touching the context menu for selecting a second option therefrom for transferring and releasing the first object to the target touch-screen at the release location.
- 33. The process as recited in claim 32 wherein an option from the context menu is a copy option so that when the first object is transferred and released to the target touch-screen, a copy of the first object remains on the source touch-screen.
- 34. The process as recited in claim 32 wherein an option from the context menu is a cut option so that when the first object is transferred and released to the target touch-screen, the first object is deleted from the source touch-screen.
- 35. The process as recited in claim 13 wherein the first object is selected by: providing a predetermined voice vocabulary; providing means for recognizing voice commands by comparing them with the predetermined vocabulary; receiving voice commands from the user; comparing the received voice commands against the predetermined vocabulary for a match; determining if a vocabulary match identifies a unique parameter of an object on the touch-screen; and selecting the object as the first object if the object having the recognized unique parameter is displayed on the source touch-screen.
- 36. The process as recited in claim 13 further comprising the steps of: providing an eye-tracking interface; detecting if a touch-screen is being watched by the user using the eye-tracking interface; and selecting the detected touch-screen as being the source touch-screen.
- 37. The process as recited in claim 36 wherein the first object is manipulated for transferring it from the source touch-screen to the target touch-screen by: tracking the eyes of the user as the user looks from the source touch-screen to the target touch-screen to detect a cross-discontinuity drag; and releasing the first object's parameters from the buffer for display of the transferred first object to the target touch-screen.
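The buffer and software latch of claims 3-5 and 14-15 can be illustrated with a minimal sketch: the latch keeps the selected object's parameters alive after the pointer lifts from the source screen, and a timer invalidates the release once a predetermined timeout expires. The class and method names, and the 3-second default, are illustrative assumptions, not from the patent.

```python
import time

class TransferBuffer:
    """Sketch of the claimed buffer/latch (claims 3-5, 14-15).
    Names and the timeout value are illustrative assumptions."""

    def __init__(self, timeout_s=3.0):
        self.timeout_s = timeout_s
        self._params = None
        self._latched_at = None

    def latch(self, params):
        # Latching survives loss of pointer contact with the source screen.
        self._params = dict(params)
        self._latched_at = time.monotonic()

    def release(self):
        # Return the stored parameters only before the timer's timeout;
        # afterwards the latch has expired and nothing is transferred.
        if self._params is None:
            return None
        if time.monotonic() - self._latched_at > self.timeout_s:
            self._params = None
            return None
        params, self._params = self._params, None
        return params
```

A release attempted after the timeout simply returns nothing, matching claim 4's requirement that the releasing means act only before timeout.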
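Claims 18-21 describe two triggers for transfer: the dragged object impinging a "hot switch zone" along the source screen's boundary, or the drag velocity exceeding a predetermined threshold. A minimal sketch, assuming the zone is a strip along the boundary nearest the target screen and that the drag path arrives as (x, y, t) samples; the zone width and velocity threshold are illustrative.

```python
def should_transfer(drag_path, screen_width, zone_px=24, min_velocity=500.0):
    """Decide whether a drag triggers transfer to the target screen.
    drag_path: list of (x, y, t) samples; thresholds are illustrative."""
    if not drag_path:
        return False
    x, y, t = drag_path[-1]
    # Hot switch zone (claims 18-19): a strip along the source boundary.
    if x >= screen_width - zone_px:
        return True
    # Velocity trigger (claims 20-21): speed of the last segment in px/s.
    if len(drag_path) >= 2:
        x0, y0, t0 = drag_path[-2]
        dt = t - t0
        if dt > 0:
            speed = ((x - x0) ** 2 + (y - y0) ** 2) ** 0.5 / dt
            if speed > min_velocity:
                return True
    return False
```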
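The gesture-based manipulation of claims 10 and 28-30 compares a drag gesture against a predefined vocabulary and maps a match to a copy or cut transfer. As a sketch only: here a sufficiently long rightward stroke is treated as "copy" and a leftward one as "cut"; the actual gesture vocabulary and matching method are not specified by the claims, so both are assumptions.

```python
def classify_gesture(points, min_len=100.0):
    """Compare a pointer gesture against a tiny predefined vocabulary.
    points: list of (x, y) samples; vocabulary and threshold are illustrative."""
    if len(points) < 2:
        return None
    dx = points[-1][0] - points[0][0]
    dy = points[-1][1] - points[0][1]
    if (dx * dx + dy * dy) ** 0.5 < min_len:
        return None  # too short to count as a deliberate gesture
    # Predominantly horizontal strokes map to the copy/cut transfers
    # of claims 29-30; anything else is unrecognized.
    if abs(dx) > abs(dy):
        return "copy" if dx > 0 else "cut"
    return None
```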
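Claims 11 and 35 require that a recognized voice command identify both an object (by a unique parameter) and a source screen, and that selection succeed only if the identified object is displayed on the identified screen. A minimal sketch, assuming the recognized command arrives as text, the vocabulary is a pair of word sets, and screen contents are a dict; all of these data shapes are illustrative.

```python
def select_by_voice(command, vocabulary, screens):
    """Sketch of voice-driven selection (claims 11, 35).
    command: recognized text; vocabulary: {"objects": set, "screens": set};
    screens: {screen_name: set of object names}. Shapes are assumptions."""
    words = set(command.lower().split())
    obj = next((o for o in vocabulary["objects"] if o in words), None)
    scr = next((s for s in vocabulary["screens"] if s in words), None)
    if obj is None or scr is None:
        return None  # command did not identify both an object and a screen
    # Select only if the identified object is on the identified screen.
    return (obj, scr) if obj in screens.get(scr, ()) else None
```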
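Claim 23's mapping step places a virtual second object on the target screen while the pointer stays on the source touch-screen. One natural reading is a proportional scaling of pointer co-ordinates between the two screen resolutions; the patent does not fix a particular mapping, so this sketch is an assumption.

```python
def map_to_target(x, y, source_size, target_size):
    """Scale a source-screen pointer position onto the target screen so a
    virtual second object can be dragged there (claim 23, illustrative)."""
    sw, sh = source_size
    tw, th = target_size
    return (x * tw / sw, y * th / sh)
```

With this mapping, releasing the first object transfers it to wherever the virtual second object was last positioned on the target screen.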
Parent Case Info
This is a Continuation-in-part of application Ser. No. 09/277,204, filed Mar. 26, 1999 now U.S. Pat. No. 6,331,840.
US Referenced Citations (7)
Continuation in Parts (1)
| Relation | Number | Date | Country |
| --- | --- | --- | --- |
| Parent | 09/277204 | Mar 1999 | US |
| Child | 09/466121 | | US |