The present invention relates to systems and methods for using touch input to move objects to an external display and interact with objects on an external display.
Many computer systems include a touch screen display that can detect touch input provided by a user. The touch input can be interpreted by the computer system to facilitate interaction with a graphical user interface. For example, using a computer system equipped with a touch screen display, a user can reposition a window within a desktop area of the computer system by touching the display in the area where the window's title bar is rendered and making a dragging motion to move the window to the desired location.
Touch-enabled computer systems can also be coupled to one or more external displays that are not touch-sensitive, for example to expand the computer system's desktop area or to make the computer system's graphical user interface visible to an audience during a presentation. In one exemplary arrangement, a clamshell-type laptop computer can include a touch screen on which a first portion of a desktop is displayed. The laptop computer can also be coupled to an external LCD monitor with no touch capabilities, on which a second portion of the desktop is displayed. In another exemplary arrangement, a tablet computer with a touch screen can be coupled to a projector to project an image onto a projection screen that is not touch-sensitive. The tablet's touch screen can display a first portion of a desktop, while the projection screen displays a second portion of the desktop.
One disadvantage associated with these arrangements is illustrated in
Another disadvantage associated with these arrangements is illustrated in
U.S. Patent Application Publication No. 2005/0015731 to Mak et al., the entire contents of which are incorporated herein by reference, discloses a computer system in which a desktop area is spread across multiple displays. In the Mak system, a first desktop portion is displayed using a first display, and a second desktop portion is displayed using a second display. A “jump pane” window is shown in the first desktop portion on the first display, in which a reproduction of the second desktop portion is displayed. In other words, the second desktop portion is not only shown on the second display, but also is mirrored in reduced form to a window on the first display. Using only the first display, a user can operate a stylus to drag objects into the “jump pane,” causing them to appear in the second desktop portion on the second display.
In the Mak system, however, the jump pane occupies a significant portion of the first display, wasting valuable desktop and display area. In addition, the jump pane is generally much smaller than the second display, which can make text or icons shown in the jump pane illegible. Even when text and icons shown in the jump pane are legible, they represent very small touch targets that require a high degree of accuracy to select, move, etc. The jump pane also increases the complexity of the user interface and reduces its intuitiveness, as it is not necessarily clear to the user that moving an object to the jump pane will cause it to move to the second display. Displaying the same content in two different locations also can be confusing or distracting to the user.
In view of these and other shortcomings, a need exists for improved systems and methods for using touch input to move objects to an external display and interact with objects on an external display.
In one aspect of at least one embodiment of the invention, a method is provided that includes displaying a first portion of a desktop on a first display device, displaying a second portion of the desktop on a second display device, and moving a first window from the first portion of the desktop to the second portion of the desktop in response to a send touch gesture that originates in the first window. The method can also include, after moving the first window, displaying a first control tab corresponding thereto on the first portion of the desktop at an edge of the first display device.
Related aspects of at least one embodiment of the invention provide a method, e.g., as described above, in which the send touch gesture comprises a flick gesture in the direction of the second display device.
Related aspects of at least one embodiment of the invention provide a method, e.g., as described above, in which the edge of the first display device is an edge that is most proximate to the second display device.
Related aspects of at least one embodiment of the invention provide a method, e.g., as described above, in which the first display device is a touch screen display device and the second display device is not a touch screen display device.
Related aspects of at least one embodiment of the invention provide a method, e.g., as described above, that includes decorating the first window and the first control tab with a corresponding label.
Related aspects of at least one embodiment of the invention provide a method, e.g., as described above, in which the corresponding label comprises at least one of a color, a text label, and an image label.
Related aspects of at least one embodiment of the invention provide a method, e.g., as described above, in which the first control tab is displayed without displaying a reproduction of the second portion of the desktop on the first portion of the desktop.
Related aspects of at least one embodiment of the invention provide a method, e.g., as described above, that includes moving the first window from the second portion of the desktop to the first portion of the desktop in response to a retrieve touch gesture that originates in the first control tab.
Related aspects of at least one embodiment of the invention provide a method, e.g., as described above, in which the retrieve touch gesture comprises a drag gesture.
Related aspects of at least one embodiment of the invention provide a method, e.g., as described above, that includes displaying a representation of the first window and the second display device on the first portion of the desktop in response to a move/resize touch gesture performed on the first control tab. The method can also include receiving a touch gesture performed on the representation, the touch gesture being indicative of a move instruction or a resize instruction, and moving or resizing the first window within the second portion of the desktop in response to the touch gesture performed on the representation. It will be appreciated that a move/resize touch gesture can be any of a variety of gestures, including without limitation a tap gesture, double tap gesture, drag gesture, pinch gesture, spread gesture, or any of a number of custom gestures.
Related aspects of at least one embodiment of the invention provide a method, e.g., as described above, that includes moving a plurality of windows from the first portion of the desktop to the second portion of the desktop in response to a plurality of send gestures, each of the plurality of send gestures originating in a corresponding one of the plurality of windows. The method can also include displaying a plurality of control tabs on the first portion of the desktop at an edge of the first display device, each of the plurality of control tabs corresponding to one of the plurality of windows.
Related aspects of at least one embodiment of the invention provide a method, e.g., as described above, that includes automatically arranging the plurality of windows within the second portion of the desktop after they are moved to the second portion of the desktop.
Related aspects of at least one embodiment of the invention provide a method, e.g., as described above, that includes receiving a select touch gesture performed on one of the plurality of control tabs and, in response to the select touch gesture, bringing a window positioned on the second portion of the desktop that corresponds to the control tab on which the select touch gesture was performed to the front and giving the window focus.
In another aspect of at least one embodiment of the invention, a system is provided that includes one or more microprocessors, the one or more microprocessors being programmed to provide a desktop display module configured to display a first portion of a desktop on a first display device and a second portion of the desktop on a second display device. The one or more microprocessors can also be programmed to provide a touch gesture processing module configured to receive touch gestures in the form of information indicative of touch input performed by a user, and a window control module configured to move a first window from the first portion of the desktop to the second portion of the desktop in response to a send touch gesture that originates in the first window and that is received by the touch gesture processing module. The one or more microprocessors can also be programmed to provide a control tab display module configured to display a first control tab corresponding to the first window on the first portion of the desktop at an edge of the first display device after the first window is moved by the window control module.
Related aspects of at least one embodiment of the invention provide a system, e.g., as described above, in which the one or more microprocessors are programmed to provide an interface decoration module configured to decorate the first window and the first control tab with a corresponding label, the corresponding label comprising at least one of a color, a text label, and an image label.
Related aspects of at least one embodiment of the invention provide a system, e.g., as described above, in which the control tab display module is configured to display the first control tab without displaying a reproduction of the second portion of the desktop on the first portion of the desktop.
Related aspects of at least one embodiment of the invention provide a system, e.g., as described above, in which the window control module is configured to move the first window from the second portion of the desktop to the first portion of the desktop in response to a retrieve touch gesture that originates in the first control tab and that is received by the touch gesture processing module.
Related aspects of at least one embodiment of the invention provide a system, e.g., as described above, in which the one or more microprocessors are programmed to provide a representation display module configured to display a representation of the first window and the second display device on the first portion of the desktop in response to a move/resize touch gesture received by the touch gesture processing module. The window control module can be configured to move or resize the first window within the second portion of the desktop in response to a touch gesture performed on the representation, the touch gesture being indicative of a move instruction or a resize instruction and being received by the touch gesture processing module.
Related aspects of at least one embodiment of the invention provide a system, e.g., as described above, in which the window control module is configured to move a plurality of windows from the first portion of the desktop to the second portion of the desktop in response to a plurality of send gestures received by the touch gesture processing module, each of the plurality of send gestures originating in a corresponding one of the plurality of windows. The control tab display module can be configured to display a plurality of control tabs on the first portion of the desktop at an edge of the first display device, each of the plurality of control tabs corresponding to one of the plurality of windows.
Related aspects of at least one embodiment of the invention provide a system, e.g., as described above, in which the window control module is configured to automatically arrange the plurality of windows within the second portion of the desktop after they are moved to the second portion of the desktop.
Related aspects of at least one embodiment of the invention provide a system, e.g., as described above, in which the window control module is configured, in response to a select touch gesture performed on one of the plurality of control tabs and received by the touch gesture processing module, to bring a window positioned on the second portion of the desktop that corresponds to the control tab on which the select touch gesture was performed to the front and to give the window focus.
In another aspect of at least one embodiment of the invention, a non-transitory computer-readable storage medium having a program stored thereon is provided. The program can be configured to cause a microprocessor to execute a desktop display function that causes a first portion of a desktop to be displayed on a first display device and a second portion of the desktop to be displayed on a second display device. The program can also be configured to cause the microprocessor to execute a touch gesture processing function that receives touch gestures in the form of information indicative of touch input performed by a user, and a window control function that moves a first window from the first portion of the desktop to the second portion of the desktop in response to a send touch gesture that originates in the first window and that is received by the touch gesture processing function. The program can also be configured to cause a microprocessor to execute a control tab display function that displays a first control tab corresponding to the first window on the first portion of the desktop at an edge of the first display device after the first window is moved by the window control function.
The present invention further provides devices, systems, and methods as claimed.
The invention will be more fully understood from the following detailed description taken in conjunction with the accompanying drawings, in which:
Certain exemplary embodiments will now be described to provide an overall understanding of the principles of the structure, function, manufacture, and use of the methods, systems, and devices disclosed herein. One or more examples of these embodiments are illustrated in the accompanying drawings. Those skilled in the art will understand that the methods, systems, and devices specifically described herein and illustrated in the accompanying drawings are non-limiting exemplary embodiments and that the scope of the present invention is defined solely by the claims. The features illustrated or described in connection with one exemplary embodiment may be combined with the features of other embodiments. Such modifications and variations are intended to be included within the scope of the present invention.
Systems and methods are disclosed herein that generally involve allowing movement of windows or other user interface objects back and forth between a touch screen display and a non-touch display using only touch inputs. In one embodiment, a “send” touch gesture performed on a window displayed on the touch screen display causes automatic movement of the window to the non-touch display. A tab corresponding to the moved window is then displayed on the touch screen display. The tab can be used to interact with the window using touch inputs, even though the window has been moved to a non-touch display. For example, a “retrieve” touch gesture can be performed on the tab to move the window back to the touch screen display, or a “select” touch gesture can be performed on the tab to bring the moved window to the front and give the moved window focus. Systems and methods are also disclosed that allow movement and manipulation of windows or other objects displayed on any number of external or auxiliary displays using only touch inputs applied to a primary display.
It will be appreciated that the systems and methods disclosed herein can be implemented using one or more computer systems. The term “computer system” as used herein refers to any of a variety of digital data processing devices, including personal computers, desktop computers, laptop computers, tablet computers, server computers, cell phones, PDAs, gaming systems, televisions, radios, portable music players, and the like. The systems and methods disclosed herein can also be implemented in part or in full using software, which can be stored as an executable program or programs on one or more non-transitory computer-readable storage mediums. The term “external display” as used herein can refer to displays that are mounted in a chassis or package that is physically separate from other displays in the system, as well as to displays that are mounted in the same chassis or package as other displays in the system. Thus, in a system that includes multiple displays in a single chassis or package, one or more of the displays can be considered “external,” despite being mounted in the same unit as a primary or other display.
The illustrated computer system 200 includes a processor 208 which controls the operation of the computer system 200, for example by executing an operating system (OS), a basic input/output system (BIOS), device drivers, application programs, and so forth. The processor 208 can include any type of microprocessor or central processing unit (CPU), including programmable general-purpose or special-purpose microprocessors and/or any one of a variety of proprietary or commercially-available single or multi-processor systems. The computer system 200 also includes a memory 210, which provides temporary storage for code to be executed by the processor 208 or for data that is processed by the processor 208. The memory 210 can include read-only memory (ROM), flash memory, one or more varieties of random access memory (RAM), and/or a combination of memory technologies. The various elements of the computer system 200 are coupled to a bus system 212. The illustrated bus system 212 is an abstraction that represents any one or more separate physical busses, communication lines/interfaces, and/or multi-drop or point-to-point connections, connected by appropriate bridges, adapters, and/or controllers.
The computer system 200 also includes a network interface 214, an input/output (IO) interface 216, a storage device 218, and a display controller 220. The network interface 214 enables the computer system 200 to communicate with remote devices (e.g., other computer systems) over a network. The IO interface 216 facilitates communication between one or more input devices (e.g., touch screens, keyboards, or pointing devices), one or more output devices (e.g., speakers, printers, or removable memories), and the various other components of the computer system 200. The storage device 218 can include any conventional medium for storing data in a non-volatile and/or non-transient manner. The storage device 218 can thus hold data and/or instructions in a persistent state (i.e., the value is retained despite interruption of power to the computer system 200). The storage device 218 can include one or more hard disk drives, flash drives, USB drives, optical drives, various media disks or cards, and/or any combination thereof and can be directly connected to the other components of the computer system 200 or remotely connected thereto, such as over a network. The display controller 220 includes a video processor and a video memory, and generates images to be displayed on one or more displays in accordance with instructions received from the processor 208.
The computer system 200 also includes a first display 204 that is capable of receiving touch input from a user (i.e., a touch screen display), for example by detecting the presence and location of a touch event that occurs within a display area 222 of the first display 204. Any of a variety of touch screen display technologies can be used by the first display 204, including capacitive, resistive, optical imaging, infrared, and/or surface acoustic wave (SAW) systems. The first display 204 is coupled to the display controller 220, which provides images to be displayed on the first display 204. The first display 204 is also coupled to the IO interface 216 such that touch inputs performed on or recognized or detected by the first display 204 can be received and processed by the processor 208. Software executed by the processor 208 can recognize or interpret touch inputs as any of a variety of predetermined gestures, such as a tap gesture, a multi-tap gesture, a flick gesture, a drag gesture, a tap and hold gesture, a pinch gesture, a spread gesture, and so forth.
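Merely by way of illustration, and not by way of limitation, the following sketch shows one way such gesture recognition could be implemented in software. The TouchSample structure, the gesture names, and the distance, duration, and speed thresholds are assumptions chosen for the example and are not required by any particular operating system or touch controller.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class TouchSample:
    x: float   # position in touch screen coordinates (pixels)
    y: float
    t: float   # timestamp in seconds

def classify_gesture(samples: List[TouchSample]) -> str:
    """Classify a completed single-finger touch sequence as a tap, flick, or drag.

    The thresholds below are illustrative assumptions only.
    """
    if len(samples) < 2:
        return "tap"
    dx = samples[-1].x - samples[0].x
    dy = samples[-1].y - samples[0].y
    distance = (dx * dx + dy * dy) ** 0.5
    duration = max(samples[-1].t - samples[0].t, 1e-6)
    speed = distance / duration              # pixels per second

    if distance < 10 and duration < 0.3:
        return "tap"
    if distance < 10:
        return "tap_and_hold"
    if speed > 1500:                         # short, fast stroke
        return "flick"
    return "drag"
```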
The computer system 200 also includes a second display 206 that is not capable of receiving touch input from a user (i.e., a non-touch display). Exemplary second displays include LCD monitors, CRT monitors, television screens, projection screens, and the like. The second display 206 is also coupled to the display controller 220, which provides images to be displayed on the second display 206. In an exemplary system in which a laptop computer is coupled to an external monitor, the laptop's integrated touch screen display can be considered the first display and the external monitor can be considered the second display.
One or more software modules can be executed by the computer system 200 to facilitate human interaction with the computer system 200. These software modules can be part of a single program or one or more separate programs, and can be implemented in a variety of contexts (e.g., as part of an operating system, a device driver, a standalone application, and/or combinations thereof). It will be appreciated that functions disclosed herein as being performed by a particular module can also be performed by any other module or combination of modules.
In the illustrated embodiment, a desktop display module displays a graphical user interface that includes a desktop area in which various windows and other objects can be displayed. The desktop area can be spread across the touch screen display 204 and the non-touch display 206 such that a first portion 224 of the desktop is displayed on the touch screen display 204 and a second portion 226 of the desktop is displayed on the non-touch display 206. In operation, a user can manipulate objects 202 in the graphical user interface by providing touch inputs to the touch screen display 204. A touch gesture processing module can detect, receive, and/or interpret touch input provided by a user, or information indicative of such touch input. The graphical user interface can then be manipulated in accordance with the touch input, either by the touch gesture processing module or one or more other modules.
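By way of a non-limiting example, the data kept by such modules could be organized along the following lines. The class names, the Rect structure, and the specific display geometry are assumptions made for purposes of illustration; any equivalent representation of a desktop spanning multiple displays could be used.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Rect:
    x: int
    y: int
    width: int
    height: int

@dataclass
class Display:
    name: str
    bounds: Rect           # position of the display within the desktop coordinate space
    touch_capable: bool

@dataclass
class Window:
    window_id: int
    title: str
    bounds: Rect           # position within the desktop coordinate space

@dataclass
class Desktop:
    """A desktop area spread across several displays."""
    displays: List[Display]
    windows: Dict[int, Window] = field(default_factory=dict)

    def display_containing(self, window: Window) -> Display:
        # Return the display whose bounds contain the window's top-left corner.
        for d in self.displays:
            b = d.bounds
            if b.x <= window.bounds.x < b.x + b.width and \
               b.y <= window.bounds.y < b.y + b.height:
                return d
        return self.displays[0]

# Example arrangement: a touch screen on the left, a non-touch monitor on the right.
touch_screen = Display("touch", Rect(0, 0, 1280, 800), touch_capable=True)
external = Display("external", Rect(1280, 0, 1920, 1080), touch_capable=False)
desktop = Desktop(displays=[touch_screen, external])
```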
As shown in
As shown in
When a window 202 is moved to the non-touch display 206, its size, position, and/or other properties can be automatically adjusted based on any of a variety of predetermined behaviors, which can optionally be user-configurable. For example, the window 202 can be automatically centered, left-aligned, right-aligned, top-aligned, bottom-aligned, tiled, layered, maximized, minimized, brought to the front, sent to the back, etc. upon being moved to the non-touch display 206. In addition, moving the window 202 to the non-touch display 206 can automatically cause the window 202 to gain focus for keyboard or other input, or to lose focus.
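The sketch below illustrates, under the assumption of a small set of named placement policies, how such a predetermined behavior could be evaluated. The behavior names and the coordinate convention (coordinates relative to the target display) are examples only.

```python
def place_on_display(win_w, win_h, disp_w, disp_h, behavior="center"):
    """Return (x, y, width, height) for a window newly moved to a display.

    Coordinates are relative to the target display; the behavior names are
    illustrative, user-configurable policies rather than an exhaustive list.
    """
    if behavior == "maximize":
        return (0, 0, disp_w, disp_h)
    if behavior == "center":
        return ((disp_w - win_w) // 2, (disp_h - win_h) // 2, win_w, win_h)
    if behavior == "left_align":
        return (0, (disp_h - win_h) // 2, win_w, win_h)
    if behavior == "right_align":
        return (disp_w - win_w, (disp_h - win_h) // 2, win_w, win_h)
    # Default: keep the window's size and place it at the top-left corner.
    return (0, 0, win_w, win_h)

# A 640 x 480 window centered on a 1920 x 1080 external display:
print(place_on_display(640, 480, 1920, 1080, "center"))   # (640, 300, 640, 480)
```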
The illustrated system 200 thus permits a window 202 to be completely moved onto a non-touch display 206 (or any other type of display) using only touch input.
As shown in
As shown in
As shown in
In some cases, the window control module may be configured to launch new windows directly to the non-touch display 206, such as when a new application is launched by a user or a new document is opened or created within an application. In these instances, the control tab display module can be configured to automatically draw a tab corresponding to the new window on the touch screen display 204. The control tab display module can thus ensure that all windows shown on non-touch displays have a corresponding tab shown on the touch screen display, such that an ability to interact with such windows using only touch inputs is preserved.
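One minimal sketch of such a window-creation hook, assuming a simple dictionary standing in for the control tab display module's state, follows. The function name and the dictionary layout are hypothetical and are used only to illustrate that every window placed on a non-touch display receives a corresponding tab.

```python
def on_window_created(window_id, title, target_display, tabs):
    """Record a control tab for any new window placed on a non-touch display.

    'tabs' stands in for the control tab display module's state; a real
    implementation would also draw the tab on the touch screen display.
    """
    if not target_display["touch_capable"]:
        tabs[window_id] = {"label": title, "display": target_display["name"]}
    return tabs

tabs = {}
external = {"name": "external", "touch_capable": False}
on_window_created(42, "Word Processor", external, tabs)
print(tabs)   # {42: {'label': 'Word Processor', 'display': 'external'}}
```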
As shown in
In addition to color-coding, or as an alternative thereto, the tabs and windows can be provided with text labels to visually display the correspondence relationships therebetween. In the embodiment illustrated in
“Word Processor” is displayed in a window on the non-touch display, its corresponding tab can be labeled with “Word Processor.” By way of further example, when a text file named “document1.txt” is displayed in a window on the non-touch display, its corresponding tab can be labeled with “document1.txt.”
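The following sketch shows one way a shared decoration could be assigned to a moved window and its control tab. The particular palette and the dictionary form of the decoration are assumptions made solely for illustration.

```python
import itertools

# A small palette of frame colors; the specific colors are arbitrary examples.
_palette = itertools.cycle(["red", "blue", "green", "orange", "purple"])

def decorate(window_title: str) -> dict:
    """Produce a decoration shared by a moved window and its control tab.

    Both the window frame on the non-touch display and the tab on the touch
    screen display receive the same color and the same text label, so the
    user can see at a glance which tab corresponds to which window.
    """
    return {"color": next(_palette), "text_label": window_title}

word_processor = decorate("Word Processor")
text_file = decorate("document1.txt")
print(word_processor)   # e.g. {'color': 'red', 'text_label': 'Word Processor'}
print(text_file)        # e.g. {'color': 'blue', 'text_label': 'document1.txt'}
```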
When multiple windows are moved to the same non-touch display 206, the window control module can be configured to automatically arrange the windows or adjust their size, position, and/or other properties based on any of a variety of predetermined behaviors. These behaviors can be user-configurable and can be different from the behaviors used when only a single window is moved to the non-touch display 206. As shown in
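A grid-based tiling arrangement is one example of such a predetermined behavior; the sketch below shows one possible computation of that layout. The grid heuristic (square-ish grid sized from the window count) is an assumption, and layered or cascaded arrangements could be computed in a similar fashion.

```python
import math

def tile_windows(n_windows, disp_w, disp_h):
    """Return a list of (x, y, width, height) rectangles that tile a display."""
    if n_windows == 0:
        return []
    cols = math.ceil(math.sqrt(n_windows))
    rows = math.ceil(n_windows / cols)
    cell_w, cell_h = disp_w // cols, disp_h // rows
    rects = []
    for i in range(n_windows):
        r, c = divmod(i, cols)
        rects.append((c * cell_w, r * cell_h, cell_w, cell_h))
    return rects

# Three windows tiled on a 1920 x 1080 display: two on top, one below.
print(tile_windows(3, 1920, 1080))
# [(0, 0, 960, 540), (960, 0, 960, 540), (0, 540, 960, 540)]
```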
The touch gesture processing module can also be configured to recognize a “select” operation to permit a user to give focus to a window displayed on a non-touch display 206 or to remove focus from a window displayed on a non-touch display 206 using only touch input. The select operation can also automatically bring a window to the front when focus is applied thereto, or automatically send a window to the back when focus is removed therefrom. In one embodiment, the select operation includes performing a predetermined touch gesture on the control tab corresponding to the window that the user wishes to apply focus to or remove focus from (e.g., a single tap gesture). As shown in
The control tabs 230 that are displayed on the touch screen display 204 can optionally be provided with buttons or other controls for manipulating their corresponding windows. For example, each tab can be provided with one or more of a maximize button, a minimize button, a close button, a move button, a resize button, etc. such that these functions can be performed on the corresponding window 202 using only touch input, even though the window 202 is displayed on the non-touch display 206. Instead of providing buttons on the tabs 230 to perform these functions, or in addition thereto, the touch gesture processing module can be configured to associate various touch gestures with these functions. For example, a double tap gesture can be interpreted as a “close window” instruction, or a pinch gesture can be interpreted as a “resize window” instruction.
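Such an association of gestures with window-management commands could be represented as a simple lookup, as in the sketch below. The particular gesture-to-command pairings are illustrative assumptions and could be user-configurable.

```python
# Illustrative mapping of touch gestures performed on a control tab to
# window-management commands; the pairings are examples only.
TAB_GESTURE_COMMANDS = {
    "tap": "select",                       # bring the window to the front and give it focus
    "drag": "retrieve",                    # move the window back to the touch screen display
    "double_tap": "close",                 # close the corresponding window
    "pinch": "resize",                     # resize the corresponding window
    "tap_and_hold": "move_resize_flyout",  # show the move/resize fly-out
}

def command_for_tab_gesture(gesture: str) -> str:
    return TAB_GESTURE_COMMANDS.get(gesture, "ignore")

print(command_for_tab_gesture("drag"))   # retrieve
```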
As shown in
Once the fly-out 252 is displayed, the touch gesture processing module can detect user input within the fly-out, determine whether any of a variety of predetermined size and position adjustment operations have been performed, and instruct the window control module to adjust the size and position of the corresponding window 202 accordingly. For example, a drag gesture that originates within the wireframe representation 256 of the window 202 can be recognized as a move operation. A drag gesture that originates on a top or bottom edge of the wireframe representation 256 can be recognized as an adjust vertical size instruction, and a drag gesture that originates on a right or left edge of the wireframe representation 256 can be recognized as an adjust horizontal size instruction. A drag gesture that originates on a corner of the wireframe representation 256 can be recognized as an adjust vertical and horizontal size instruction. Pinch and spread gestures can be recognized as a reduce window instruction and an enlarge window instruction, respectively. When the user is finished resizing and/or repositioning the window, the fly-out 252 can be dismissed, for example by touching the touch screen display 204 in an area outside of the fly-out 252, by touching a close or cancel button provided on the fly-out 252, or by allowing a predetermined time to elapse without providing touch input to the fly-out 252.
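Because the fly-out presents a scaled-down representation of the non-touch display, adjustments made to the wireframe can be mapped back to display coordinates by a simple proportional scaling, as the following sketch illustrates. The function name and the tuple-based geometry are assumptions made for the example.

```python
def flyout_to_display(rect_in_flyout, flyout_size, display_size):
    """Map a wireframe rectangle drawn inside the fly-out to display coordinates.

    rect_in_flyout: (x, y, w, h) of the wireframe within the fly-out.
    flyout_size:    (width, height) of the fly-out's display representation.
    display_size:   (width, height) of the non-touch display it represents.
    """
    sx = display_size[0] / flyout_size[0]
    sy = display_size[1] / flyout_size[1]
    x, y, w, h = rect_in_flyout
    return (round(x * sx), round(y * sy), round(w * sx), round(h * sy))

# A 100 x 75 wireframe at (40, 30) inside a 320 x 180 fly-out representing a
# 1920 x 1080 display corresponds to a 600 x 450 window at (240, 180).
print(flyout_to_display((40, 30, 100, 75), (320, 180), (1920, 1080)))
```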
One exemplary method of operation of the computer system 200 is illustrated schematically in the flow chart of
As shown, operation begins at a starting point S300. The system then determines at decision block D302 whether a touch event has occurred. If no touch event has occurred, the system passes a hook to the operating system or other underlying software at step S304 and returns to the starting point S300.
If it is determined at decision block D302 that a touch event has occurred, the system then determines at decision block D306 whether the touch event is a flick gesture. If the touch event is a flick gesture, the system determines at decision block D308 whether the touch began inside a window. If the touch did not begin inside a window, the system passes a hook at step S304 and returns to the starting point S300. If it is determined at decision block D308 that the touch began inside a window, the system determines the direction of the flick gesture at step S310, and then determines whether an external display is positioned in the direction of the flick gesture at decision block D312. If there is no display physically positioned in the direction of the flick gesture, the system passes a hook at step S304 and returns to the starting point S300. If it is determined at decision block D312 that a display is physically positioned in the direction of the flick, the window position is translated to the external display at step S314. The window is then decorated at step S316 (e.g., by adding a color frame or text label), and a tab is created at the edge of the touch screen display with a corresponding decoration at step S318. The system then returns to the starting point S300.
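One non-limiting way to decide whether a display is positioned in the direction of the flick (decision block D312) is to compare display centers along the dominant axis of the flick, as sketched below. The rectangle convention and the center-comparison test are assumptions; any suitable geometric test could be substituted.

```python
def display_in_flick_direction(flick_dx, flick_dy, source, displays):
    """Return the display that lies in the direction of a flick, if any.

    'source' and each entry of 'displays' are (x, y, width, height) rectangles
    in desktop coordinates.
    """
    sx = source[0] + source[2] / 2
    sy = source[1] + source[3] / 2
    horizontal = abs(flick_dx) >= abs(flick_dy)
    for d in displays:
        cx, cy = d[0] + d[2] / 2, d[1] + d[3] / 2
        if horizontal and (cx - sx) * flick_dx > 0:
            return d
        if not horizontal and (cy - sy) * flick_dy > 0:
            return d
    return None

touch_screen = (0, 0, 1280, 800)
external = (1280, 0, 1920, 1080)
# A rightward flick on the touch screen selects the external display.
print(display_in_flick_direction(400, 10, touch_screen, [external]))
```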
If it is determined at decision block D306 that the touch event is not a flick gesture, the system determines at decision block D320 whether the touch event is a tap gesture or a tap and hold gesture. If the touch event is a tap gesture or a tap and hold gesture, the system determines at decision block D322 whether the touch event occurred inside a tab. If the touch event did not occur inside a tab, the system passes a hook at step S304 and returns to the starting point S300. If it is determined at decision block D322 that the touch event did occur inside a tab, the window corresponding to the tab is brought to the front at step S324, and the corresponding window is given focus at step S326. The system then returns to the starting point S300.
If it is determined at decision block D320 that the touch event is not a tap gesture or a tap and hold gesture, the system determines at decision block D328 whether the touch event is a drag gesture. If the touch event is a drag gesture, the system determines at decision block D330 whether the drag gesture began inside a tab. If the drag gesture did not begin inside a tab, the system passes a hook at step S304 and returns to the starting point S300. If it is determined at decision block D330 that the drag gesture did begin inside a tab, the window corresponding to the tab is hidden at step S332. The tab is then moved in concert with the drag gesture at step S334, and it is determined at decision block D336 whether the drag gesture has ended. If the drag gesture has not yet ended, the system returns to step S334 and thus continues to move the tab in concert with the drag gesture. This process repeats until the drag gesture ends. When it is determined at decision block D336 that the drag gesture has ended, the tab is destroyed at step S338 and the hidden window is repositioned to the touch screen display and unhidden at step S340. The system then returns to the starting point S300.
If it is determined at decision block D328 that the touch event is not a drag gesture, the system determines at decision block D342 whether the touch event is a size/position gesture. If the touch event is a size/position gesture, the system determines at decision block D344 whether the touch began inside a tab. If the size/position gesture did not begin inside a tab, the system passes a hook at step S304 and returns to the starting point S300. If it is determined at decision block D344 that the touch event began inside a tab, the size/position fly-out is displayed at step S346. The window corresponding to the tab is then resized and/or repositioned based on user input to the fly-out at step S348. The system then determines whether the fly-out has been dismissed at decision block D350. If the fly-out has not been dismissed, the system returns to step S348 and thus continues to resize and/or reposition the window in accordance with user input. This process repeats until the fly-out is dismissed. When it is determined at decision block D350 that the fly-out has been dismissed, the fly-out is destroyed at step S352 and the system returns to the starting point S300.
If it is determined at decision block D342 that the touch event is not a size/position gesture, the system passes a hook at step S304 and returns to the starting point S300.
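The decision logic described above can be summarized by a dispatch routine along the following lines. This is a minimal sketch only: the event dictionary, the flag names, and the handler callables (which stand in for the window control, control tab display, and representation display modules) are assumptions made for illustration.

```python
def handle_touch_event(event, send, select, retrieve, show_flyout, pass_hook):
    """Route a classified touch event per the flow described above."""
    gesture = event.get("gesture")
    if gesture == "flick" and event.get("in_window"):
        return send(event)             # D306/D308: send the window to the other display
    if gesture in ("tap", "tap_and_hold") and event.get("in_tab"):
        return select(event)           # D320/D322: bring to front and give focus
    if gesture == "drag" and event.get("in_tab"):
        return retrieve(event)         # D328/D330: move the window back
    if gesture == "size_position" and event.get("in_tab"):
        return show_flyout(event)      # D342/D344: display the move/resize fly-out
    return pass_hook(event)            # otherwise defer to the operating system

# Example: a flick that began inside a window triggers the send operation.
result = handle_touch_event(
    {"gesture": "flick", "in_window": True},
    send=lambda e: "sent", select=lambda e: "selected",
    retrieve=lambda e: "retrieved", show_flyout=lambda e: "fly-out",
    pass_hook=lambda e: "hook passed",
)
print(result)   # sent
```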
Although the invention has been described by reference to specific embodiments, it should be understood that numerous changes may be made within the spirit and scope of the inventive concepts described.
For example, while systems and methods are disclosed above in which control tabs 230 are displayed on the touch screen display 204, any of a variety of other graphical objects can be used instead or in addition, such as icons, buttons, and the like. Furthermore, the objects need not necessarily be positioned at an edge of the touch screen display 204.
By way of further example, the systems and methods disclosed herein are not limited to manipulating windows, but rather can be used to manipulate any of a variety of user interface objects, such as text, icons, images, controls, etc.
Also, while systems and methods are disclosed herein that involve one touch screen display 204 and one non-touch display 206, such systems and methods can also include any combination of one or more touch screen displays and one or more non-touch displays, or any combination of two or more touch screen displays and zero or more non-touch displays. Thus, exemplary configurations can include a configuration having one touch screen display and two non-touch displays, a configuration having two touch screen displays and zero non-touch displays, a configuration having three touch screen displays and three non-touch displays, and so forth. In configurations with more than one touch screen display, the control tabs can be displayed on the touch screen display on which the corresponding send operation is performed. In configurations with a primary touch screen display and more than one secondary or external display (whether touch screen displays, non-touch displays, or a combination thereof), windows can be sent to the display that is physically positioned in the direction of a gesture constituting the send operation. In such configurations, the control tabs can be positioned along an edge of the primary display that is most proximate to the secondary display to which a window has been sent.
As a further example, while systems and methods are disclosed that contemplate touch gestures applied directly to a touch screen display, such systems and methods can also operate using gestures performed using a touch pad, a mouse, a roller ball, a joystick, a keyboard, etc.
Accordingly, it is intended that the invention not be limited to the described embodiments, but that it have the full scope defined by the language of the following claims.