Computer devices can be coupled to any suitable number of display screens. In some examples, large display screens can depict a user interface of the computer device over a large area. Alternatively, multiple smaller display screens can display extensions of a common user interface. In some examples, application windows and operating system task bars can be separated by large distances. Depending on the size of the display screen or number of display screens, moving an application window can force a user to change physical locations.
The following presents a simplified summary in order to provide a basic understanding of some aspects described herein. This summary is not an extensive overview of the claimed subject matter. This summary is not intended to identify key or critical elements of the claimed subject matter nor delineate the scope of the claimed subject matter. This summary's sole purpose is to present some concepts of the claimed subject matter in a simplified form as a prelude to the more detailed description that is presented later.
An embodiment described herein includes a system for modifying a user interface. The system includes a processor and a memory to store a plurality of instructions that, in response to an execution by the processor, cause the processor to detect a plurality of display screens electronically coupled to the system. The plurality of instructions can also cause the processor to detect a first gesture corresponding to an application window displayed in one of the display screens and generate a preview panel in response to detecting the first gesture, the preview panel to be displayed proximate the application window, wherein the preview panel is to display a real-time image comprising content displayed in each of the display screens. Additionally, the plurality of instructions can cause the processor to detect a second gesture to move the application window to a different display screen and modify the user interface to display the application window in the different display screen.
In another embodiment, a method for modifying a user interface can include detecting a plurality of display screens electronically coupled to a system and detecting a first gesture corresponding to an application window displayed in one of the display screens. The method can also include generating a preview panel in response to detecting the first gesture, the preview panel to be displayed proximate the application window, wherein the preview panel is to display a real-time image comprising content displayed in each of the display screens. Furthermore, the method can include detecting a second gesture to move the application window to a different display screen and modifying the user interface to display the application window in the different display screen.
In another embodiment, one or more computer-readable storage media for modifying a user interface can include a plurality of instructions that, in response to execution by a processor, cause the processor to detect a plurality of display screens electronically coupled to a system. The plurality of instructions can also cause the processor to detect a first gesture corresponding to an application window displayed in one of the display screens and generate a preview panel in response to detecting the first gesture, the preview panel to be displayed proximate the application window, wherein the preview panel is to display a real-time image comprising content displayed in each of the display screens. Furthermore, the plurality of instructions can cause the processor to detect a second gesture to move the application window to a different display screen and modify the user interface to display the application window in the different display screen.
The following description and the annexed drawings set forth in detail certain illustrative aspects of the claimed subject matter. These aspects are indicative, however, of a few of the various ways in which the principles of the innovation may be employed and the claimed subject matter is intended to include all such aspects and their equivalents. Other advantages and novel features of the claimed subject matter will become apparent from the following detailed description of the innovation when considered in conjunction with the drawings.
The following detailed description may be better understood by referencing the accompanying drawings, which contain specific examples of numerous features of the disclosed subject matter.
User interfaces can be generated using various techniques. For example, a user interface can include any suitable number of applications being executed, operating system features, and the like. In some embodiments, a display screen can display large user interfaces that may include application windows spread over large distances. Additionally, multiple display screens can be electronically coupled to one or more systems to provide a representation of a user interface across the multiple display screens. Accordingly, moving an application window from a first display screen to a second display screen or moving an application window within a single display screen can include drag and drop operations that force a user to move to a new physical location.
Techniques described herein provide a system for modifying a user interface. A user interface, as referred to herein, can include any suitable number of application windows, operating system features, or any combination thereof. The application windows can provide a graphical user interface for an actively executed application that is viewable via a display screen. In some embodiments, the system can detect a plurality of display screens electronically coupled to the system. The system can also detect a first gesture corresponding to an application window displayed in one of the display screens. For example, the first gesture can indicate a selection of an application window with a drag and drop gesture, or any other suitable gesture. In some embodiments, the system can also generate a preview panel in response to detecting the first gesture. In some examples, the preview panel can be displayed proximate the application window and the preview panel can display a real-time image comprising content displayed in each of the display screens. For example, the preview panel can be a smaller representation of the real-time image being displayed in the user interface provided to the plurality of display screens. In some examples, the preview panel can include a seam, such as a black region, among others, that indicates an edge of the user interface on each display screen.
In some embodiments, the system can also detect a second gesture to move the application window to a different display screen. For example, the second gesture can indicate that an application window is to be moved across the user interface based on a drag and drop gesture, or any other suitable gesture, detected within the preview panel. In some embodiments, the system can also modify the user interface to display the application window in the different display screen.
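The first-gesture/preview/second-gesture flow described above can be sketched in a few lines of Python; the class name, method names, and screen representation below are illustrative assumptions rather than any particular implementation.

```python
# Minimal sketch of the gesture-driven window-move flow described above.
# All names here (WindowManager, on_first_gesture, etc.) are hypothetical.

class WindowManager:
    def __init__(self, screens):
        self.screens = list(screens)      # detected display screens
        self.preview_open = False
        self.window_location = {}         # window id -> screen index

    def on_first_gesture(self, window_id):
        """A drag gesture on an application window opens the preview panel."""
        self.preview_open = True

    def on_second_gesture(self, window_id, target_screen):
        """A drop inside the preview panel moves the window, then hides
        the panel."""
        if self.preview_open and 0 <= target_screen < len(self.screens):
            self.window_location[window_id] = target_screen
        self.preview_open = False
```

For example, with two screens, a first gesture on a window followed by a drop on the second screen's thumbnail leaves the window assigned to screen index 1.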
The techniques described herein enable modifying a user interface based on a preview panel. For example, the techniques enable moving an application window within a user interface from a first display screen to a second display screen. In some embodiments, the first display screen and second display screen can be connected to a single device or the first display screen and the second display screen can be connected to separate devices. For example, separate paired devices may be electronically coupled to display a single user interface.
As a preliminary matter, some of the figures describe concepts in the context of one or more structural components, referred to as functionalities, modules, features, elements, etc. The various components shown in the figures can be implemented in any manner, for example, by software, hardware (e.g., discrete logic components, etc.), firmware, and so on, or any combination of these implementations. In one embodiment, the various components may reflect the use of corresponding components in an actual implementation. In other embodiments, any single component illustrated in the figures may be implemented by a number of actual components. The depiction of any two or more separate components in the figures may reflect different functions performed by a single actual component.
Other figures describe the concepts in flowchart form. In this form, certain operations are described as constituting distinct blocks performed in a certain order. Such implementations are exemplary and non-limiting. Certain blocks described herein can be grouped together and performed in a single operation, certain blocks can be broken apart into plural component blocks, and certain blocks can be performed in an order that differs from that which is illustrated herein, including a parallel manner of performing the blocks. The blocks shown in the flowcharts can be implemented by software, hardware, firmware, and the like, or any combination of these implementations. As used herein, hardware may include computer systems, discrete logic components, such as application specific integrated circuits (ASICs), and the like, as well as any combinations thereof.
As for terminology, the phrase “configured to” encompasses any way that any kind of structural component can be constructed to perform an identified operation. The structural component can be configured to perform an operation using software, hardware, firmware and the like, or any combinations thereof. For example, the phrase “configured to” can refer to a logic circuit structure of a hardware element that is to implement the associated functionality. The phrase “configured to” can also refer to a logic circuit structure of a hardware element that is to implement the coding design of associated functionality of firmware or software. The term “module” refers to a structural element that can be implemented using any suitable hardware (e.g., a processor, among others), software (e.g., an application, among others), firmware, or any combination of hardware, software, and firmware.
The term “logic” encompasses any functionality for performing a task. For instance, each operation illustrated in the flowcharts corresponds to logic for performing that operation. An operation can be performed using software, hardware, firmware, etc., or any combinations thereof.
As utilized herein, terms “component,” “system,” “client” and the like are intended to refer to a computer-related entity: hardware, software (e.g., software in execution), firmware, or a combination thereof. For example, a component can be a process running on a processor, an object, an executable, a program, a function, a library, a subroutine, and/or a computer or a combination of software and hardware. By way of illustration, both an application running on a server and the server can be a component. One or more components can reside within a process and a component can be localized on one computer and/or distributed between two or more computers.
Furthermore, the claimed subject matter may be implemented as a method, apparatus, or article of manufacture using standard programming and/or engineering techniques to produce software, firmware, hardware, or any combination thereof to control a computer to implement the disclosed subject matter. The term “article of manufacture” as used herein is intended to encompass a computer program accessible from any tangible, computer-readable device, or media.
Computer-readable storage media can include but are not limited to magnetic storage devices (e.g., hard disk, floppy disk, and magnetic strips, among others), optical disks (e.g., compact disk (CD), and digital versatile disk (DVD), among others), smart cards, and flash memory devices (e.g., card, stick, and key drive, among others). In contrast, computer-readable media generally (i.e., not storage media) may additionally include communication media such as transmission media for wireless signals and the like.
The system bus 108 couples system components including, but not limited to, the system memory 106 to the processing unit 104. The processing unit 104 can be any of various available processors. Dual microprocessors and other multiprocessor architectures also can be employed as the processing unit 104.
The system bus 108 can be any of several types of bus structure, including the memory bus or memory controller, a peripheral bus or external bus, and a local bus using any variety of available bus architectures known to those of ordinary skill in the art. The system memory 106 includes computer-readable storage media that includes volatile memory 110 and nonvolatile memory 112.
In some embodiments, a unified extensible firmware interface (UEFI) manager or a basic input/output system (BIOS), containing the basic routines to transfer information between elements within the computer 102, such as during start-up, is stored in nonvolatile memory 112. By way of illustration, and not limitation, nonvolatile memory 112 can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory.
Volatile memory 110 includes random access memory (RAM), which acts as external cache memory. By way of illustration and not limitation, RAM is available in many forms such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), SynchLink™ DRAM (SLDRAM), Rambus® direct RAM (RDRAM), direct Rambus® dynamic RAM (DRDRAM), and Rambus® dynamic RAM (RDRAM).
The computer 102 also includes other computer-readable media, such as removable/non-removable, volatile/non-volatile computer storage media.
In addition, disk storage 114 can include storage media separately or in combination with other storage media including, but not limited to, an optical disk drive such as a compact disk ROM device (CD-ROM), CD recordable drive (CD-R Drive), CD rewritable drive (CD-RW Drive) or a digital versatile disk ROM drive (DVD-ROM). To facilitate connection of the disk storage devices 114 to the system bus 108, a removable or non-removable interface is typically used such as interface 116.
System applications 120 take advantage of the management of resources by operating system 118 through program modules 122 and program data 124 stored either in system memory 106 or on disk storage 114. It is to be appreciated that the disclosed subject matter can be implemented with various operating systems or combinations of operating systems.
A user enters commands or information into the computer 102 through input devices 126. Input devices 126 include, but are not limited to, a pointing device, such as a mouse, trackball, stylus, and the like, a keyboard, a microphone, a joystick, a satellite dish, a scanner, a TV tuner card, a digital camera, a digital video camera, a web camera, any suitable dial accessory (physical or virtual), and the like. In some examples, an input device can include Natural User Interface (NUI) devices. NUI refers to any interface technology that enables a user to interact with a device in a “natural” manner, free from artificial constraints imposed by input devices such as mice, keyboards, remote controls, and the like. In some examples, NUI devices include devices relying on speech recognition, touch and stylus recognition, gesture recognition both on screen and adjacent to the screen, air gestures, head and eye tracking, voice and speech, vision, touch, gestures, and machine intelligence. For example, NUI devices can include touch sensitive displays, voice and speech recognition, intention and goal understanding, and motion gesture detection using depth cameras such as stereoscopic camera systems, infrared camera systems, RGB camera systems and combinations of these. NUI devices can also include motion gesture detection using accelerometers or gyroscopes, facial recognition, three-dimensional (3D) displays, head, eye, and gaze tracking, immersive augmented reality and virtual reality systems, all of which provide a more natural interface. NUI devices can also include technologies for sensing brain activity using electric field sensing electrodes. For example, a NUI device may use Electroencephalography (EEG) and related methods to detect electrical activity of the brain. The input devices 126 connect to the processing unit 104 through the system bus 108 via interface ports 128.
Interface ports 128 include, for example, a serial port, a parallel port, a game port, and a universal serial bus (USB).
Output devices 130 use some of the same types of ports as input devices 126. Thus, for example, a USB port may be used to provide input to the computer 102 and to output information from computer 102 to an output device 130.
Output adapter 132 is provided to illustrate that there are some output devices 130 like monitors, speakers, and printers, among other output devices 130, which are accessible via adapters. The output adapters 132 include, by way of illustration and not limitation, video and sound cards that provide a means of connection between the output device 130 and the system bus 108. It can be noted that other devices and systems of devices provide both input and output capabilities such as remote computing devices 134.
The computer 102 can be a server hosting various software applications in a networked environment using logical connections to one or more remote computers, such as remote computing devices 134. The remote computing devices 134 may be client systems configured with web browsers, PC applications, mobile phone applications, and the like. The remote computing devices 134 can be a personal computer, a server, a router, a network PC, a workstation, a microprocessor based appliance, a mobile phone, a peer device or other common network node and the like, and typically include many or all of the elements described relative to the computer 102.
Remote computing devices 134 can be logically connected to the computer 102 through a network interface 136 and then connected via a communication connection 138, which may be wireless. Network interface 136 encompasses wireless communication networks such as local-area networks (LAN) and wide-area networks (WAN). LAN technologies include Fiber Distributed Data Interface (FDDI), Copper Distributed Data Interface (CDDI), Ethernet, Token Ring and the like. WAN technologies include, but are not limited to, point-to-point links, circuit switching networks like Integrated Services Digital Networks (ISDN) and variations thereon, packet switching networks, and Digital Subscriber Lines (DSL).
Communication connection 138 refers to the hardware/software employed to connect the network interface 136 to the bus 108. While communication connection 138 is shown for illustrative clarity inside computer 102, it can also be external to the computer 102. The hardware/software for connection to the network interface 136 may include, for exemplary purposes, internal and external technologies such as, mobile phone switches, modems including regular telephone grade modems, cable modems and DSL modems, ISDN adapters, and Ethernet cards.
The computer 102 can further include a radio 140. For example, the radio 140 can be a wireless local area network radio that may operate on one or more wireless bands. For example, the radio 140 can operate on the industrial, scientific, and medical (ISM) radio band at 2.4 GHz or 5 GHz. In some examples, the radio 140 can operate on any suitable radio band at any radio frequency.
The computer 102 includes one or more modules 122, such as a display manager 142, a preview panel manager 144, and a user interface manager 146. In some embodiments, the display manager 142 can detect a plurality of display screens electronically coupled to the system. In some embodiments, the preview panel manager 144 can detect a first gesture corresponding to an application window displayed in one of the display screens. The preview panel manager 144 can also generate a preview panel in response to detecting the first gesture. In some examples, the preview panel can be displayed proximate the application window, wherein the preview panel can display a real-time image comprising content displayed in each of the display screens. An example of a preview panel is illustrated below.
At block 202, a display manager 142 can detect a plurality of display screens electronically coupled to the system. In some embodiments, the plurality of display screens can include one or more display screens attached to a single device or one or more display screens attached to multiple devices. For example, a computing device may be electronically coupled to multiple display screens. Alternatively, a tablet computing device, a laptop device, and a mobile device may each be electronically coupled to separate display screens and a combination of the display screens for the tablet computing device, laptop device, and mobile device may be paired to display a user interface. In some embodiments, any two or more computing devices can be paired to display a user interface. For example, display screens for an augmented reality device, a projector device, a desktop computing system, a mobile device, a gaming console, a virtual reality device, or any combination thereof, can be combined to display a user interface. In some embodiments, at least one of the display screens can correspond to a virtual desktop. In some examples, one device can be coupled to a single display screen and a paired device can be coupled to multiple display screens.
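Screen detection across a local device and any paired devices, as described above, can be sketched as building one logical list of screens; the data shapes and device names below are illustrative assumptions.

```python
# Hypothetical sketch of aggregating display screens from a local device and
# paired devices into one logical list, as described above.

def detect_screens(local_screens, paired_devices):
    """local_screens: list of screen identifiers on the local device.
    paired_devices: mapping of device name -> list of screen identifiers.
    Returns (device, screen) pairs for every coupled display screen."""
    screens = [("local", screen) for screen in local_screens]
    for device, device_screens in paired_devices.items():
        screens.extend((device, screen) for screen in device_screens)
    return screens
```

A tablet paired with a projector, for instance, would contribute its own screens to the same list the local device's screens occupy, so later blocks can treat all screens uniformly.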
At block 204, the preview panel manager 144 can detect a first gesture corresponding to an application window displayed in one of the display screens. In some embodiments, the first gesture can include a touch gesture on one of the display screens coupled to the system. In some examples, the touch gesture can include any number of fingers or any other portion of a hand or hands interacting with a display screen. For example, the touch gesture can include a one finger touch of the display screen, a two finger touch of the display screen, or any additional number of fingers touching the display screen. In some embodiments, the touch gesture can include two hands contacting a display screen within a size and shape of a region of the display screen in which a touch gesture can be detected. In some examples, the area of the region corresponds to any suitable touch of a display screen. For example, a first finger touching the display screen can indicate that additional fingers or hands touching the display screen can be considered part of the touch gesture within a particular distance from the first finger contact. In some embodiments, the touch gesture can also include a temporal component. For example, the touch gesture may include any number of fingers or hands contacting the display screen within a particular region within a particular time frame. In some examples, a delay between touching two fingers to the display screen can result in separate touch gestures being detected.
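The spatial and temporal grouping described above, in which contacts landing near the first contact within a time window count as one gesture, can be sketched as follows; the distance and delay thresholds are illustrative assumptions, not values from the description.

```python
import math

# Sketch of grouping touch contacts into gestures by proximity in space and
# time, as described above. Threshold values are assumed for illustration.

MAX_DISTANCE = 200.0   # pixels from the first contact of a gesture; assumed
MAX_DELAY = 0.5        # seconds after the first contact; assumed

def group_contacts(contacts):
    """contacts: list of (x, y, t) tuples sorted by time t.
    Returns a list of gestures, each a list of contacts."""
    gestures = []
    for x, y, t in contacts:
        if gestures:
            fx, fy, ft = gestures[-1][0]   # first contact of current gesture
            if (t - ft) <= MAX_DELAY and math.hypot(x - fx, y - fy) <= MAX_DISTANCE:
                gestures[-1].append((x, y, t))
                continue
        gestures.append([(x, y, t)])
    return gestures
```

Two fingers touching close together within half a second form one gesture under this sketch, while a delayed touch starts a separate gesture, mirroring the temporal component described above.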
In some embodiments, the system can extrapolate a touch gesture based on a movement proximate a display screen. For example, the preview panel manager 144 can use cameras coupled to a system to detect contactless gestures targeting portions of the display screen. The preview panel manager 144 can extrapolate or determine the location of the display screen being selected based on the contactless gesture.
In some embodiments, the preview panel manager 144 detects the first gesture in areas of an application window. For example, the preview panel manager 144 can detect the first gesture applied to an application title bar of an application window or application window features such as a minimize feature, a maximize feature, or a close feature. In some embodiments, the preview panel manager 144 registers or detects the first gesture if the first gesture is applied to one of the application window features for a period of time that exceeds a predetermined threshold. For example, the preview panel manager 144 can detect a predetermined gesture applied to a minimize feature, a close feature, or a maximize feature for a period of time. In some embodiments, the first gesture corresponds to initiating a drag and drop operation on an application title bar. For example, the drag and drop operation can be initiated with a selection of the application title bar with a mouse, pointer, or any other suitable input device. In some examples, the first gesture can correspond to initiating a drag and drop operation with a representation of an application window provided by an application switcher.
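The hold-duration behavior described above, where a gesture on a window feature registers only when held past a predetermined threshold, can be sketched with a simple check; the target names and threshold value are hypothetical.

```python
# Sketch of registering the first gesture only when it is held on an
# application window feature past a threshold, as described above.

HOLD_THRESHOLD = 0.4  # seconds; an assumed, illustrative value

def registers_first_gesture(target, press_time, release_time):
    """target: which part of the window was touched, e.g. 'title_bar',
    'minimize', 'maximize', or 'close'. Returns True when the gesture
    should register as the first gesture."""
    if target not in ("title_bar", "minimize", "maximize", "close"):
        return False
    return (release_time - press_time) >= HOLD_THRESHOLD
```

A brief tap on the close feature would fall below the threshold and behave normally, while a longer hold would register as the first gesture and open the preview panel.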
At block 206, the preview panel manager 144 can generate a preview panel in response to detecting the first gesture. In some examples, the preview panel can be displayed proximate the application window, wherein the preview panel is to display a real-time image comprising content displayed in each of the display screens. For example, the preview panel can include a smaller representation of the real-time images being displayed on each of the display screens coupled to a system through one or more devices. In some embodiments, the preview panel can include representations of the actively executed applications displayed on each display screen. The preview panel can facilitate moving an application window from a first display screen to a second display screen. The preview panel can also move content from a first application window on a first display screen to a second application window on a second display screen. Example implementations of the preview panel are described in greater detail below.
In some embodiments, the method 200 can include generating a preview panel based on a plurality of rules corresponding to a layout of the user interface. The plurality of rules can indicate how to display a preview panel. For example, the preview panel can be generated in relation to other visual elements such as an application launcher, an application switcher, and a window list, among others. An application launcher, as referred to herein, can include a list of executable applications installed on a system, a list of recently accessed applications installed on the system, recommended applications to be installed on the system, and the like. In some examples, the application launcher can include commands that can access programs, documents, and settings. These commands can include a search function based on locally stored applications and files, a list of documents available locally on a device or on a remote server, a control panel to configure components of a device, power function commands to alter the power state of the device, and the like. An application switcher, as referred to herein, can include a link to a digital assistant, a task view illustrating all open applications, a set of icons corresponding to applications being executed, and various icons corresponding to applications and hardware features that are enabled each time a device receives power. In some embodiments, any of the features from the application switcher or application launcher can be included in a preview panel.
In some embodiments, the plurality of rules can indicate an area of a screen that is to be occupied by the preview panel. For example, the location of a preview panel may depend upon whether application windows are overlapping one another, whether more than one application window is visible, and the like. For example, the preview panel can be placed above, below, to the left or right of, or diagonal to the first gesture location. In some embodiments, the preview panel can be displayed proximate a first gesture location so that the preview panel is adjacent to a border of the display screen or application window, or centered within an application window.
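One way to realize the placement rules described above is to try candidate positions around the gesture point and keep the first that fits on screen; the candidate ordering, coordinate conventions, and fallback below are assumptions for illustration.

```python
# Sketch of preview panel placement: candidate positions around the gesture
# point are tried in order (above, below, left, right) and the first that
# fits within the screen bounds wins. All geometry here is assumed.

def place_preview(gesture, panel, screen):
    """gesture: (x, y) point; panel: (w, h) size; screen: (width, height).
    Returns the top-left corner at which to draw the preview panel."""
    gx, gy = gesture
    pw, ph = panel
    sw, sh = screen
    candidates = [
        (gx - pw // 2, gy - ph),       # above the gesture
        (gx - pw // 2, gy),            # below the gesture
        (gx - pw, gy - ph // 2),       # left of the gesture
        (gx, gy - ph // 2),            # right of the gesture
    ]
    for x, y in candidates:
        if 0 <= x and x + pw <= sw and 0 <= y and y + ph <= sh:
            return (x, y)
    # Fallback: center the panel on the screen.
    return ((sw - pw) // 2, (sh - ph) // 2)
```

A gesture near the top edge of the screen, for instance, fails the "above" candidate and falls through to "below", keeping the panel fully visible.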
At block 208, the preview panel manager 144 can detect a second gesture to move the application window to a different display screen. In some embodiments, the second gesture can include releasing a drag and drop gesture, a custom gesture, or any other suitable gesture. The custom gesture, as referred to herein, can enable a user to select multiple display screens and locations within the multiple display screens to display an application window. For example, an application window can be moved with a custom gesture from a first display screen to two or more additional display screens. In some embodiments, the custom gesture can also indicate whether the application window is to be maximized across the two or more additional display screens, displayed at the top of the two or more additional display screens, displayed at the center of the two or more additional display screens, or displayed at the bottom of the two or more additional display screens. In some embodiments, the second gesture can correspond to a seam between multiple display screens, which can indicate that the two display screens adjacent to the seam are to display the application window. In some embodiments, the second gesture can include releasing a drag and drop operation, which can be detected by an input device, such as a mouse or pointer, releasing a button that was selected as the input device moved over a distance of the user interface displayed by a display device.
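The seam behavior described above can be sketched by mapping a drop coordinate inside the preview panel to one or two target screens; the thumbnail layout and seam width here are assumed for illustration.

```python
# Sketch of resolving a drop inside the preview panel, as described above:
# a drop on the seam between two thumbnails spans the window across both
# adjacent screens; otherwise the single screen under the drop is the target.

SEAM_WIDTH = 10  # pixels between thumbnails in the preview panel; assumed

def resolve_drop(x, thumb_width, screen_count):
    """x: drop x-coordinate inside the panel; thumbnails are laid out left to
    right with seams between them. Returns the list of target screen indices."""
    stride = thumb_width + SEAM_WIDTH
    index = x // stride
    offset = x % stride
    if offset >= thumb_width and index + 1 < screen_count:
        return [index, index + 1]      # drop landed on a seam: span both
    return [min(index, screen_count - 1)]
```

Dropping within a thumbnail moves the window to that one screen, while dropping on the black seam region between two thumbnails spans the window across both adjacent screens.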
At block 210, the user interface manager 146 can modify the user interface to display the application window in the different display screen. In some embodiments, the user interface manager 146 can display the application window in any suitable number of different display screens. For example, the user interface manager 146 can modify the user interface to display an application window spread across one or more display screens attached to a system. Modifying the user interface is described in greater detail below.
In some embodiments, the method 200 can include hiding the preview panel in response to detecting the second gesture moving more than a predetermined distance from the preview panel. In some embodiments, the preview panel comprises actively executed applications being displayed in each of the display screens and the method 200 can include moving or pulling an application window from a remote device to a local system. In some embodiments, the method 200 can include detecting a third gesture or custom gesture within the preview panel that enables user input to determine the display screens and locations within the display screens to display the application window. In some embodiments, the plurality of display screens can include a first display screen from a first device and a second display screen from a second device. In some examples, the first device is a tablet device and the second device is a projector device. In some embodiments, the method 200 can include displaying actively executed applications in each of the display screens and moving content from the application window to a second application window in response to detecting a drag and drop gesture.
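The hide-on-distance behavior mentioned above can be sketched as a distance test against the panel's nearest edge; the threshold value and rectangle convention are assumptions.

```python
import math

# Sketch of hiding the preview panel when the second gesture strays too far
# from it, as described above. The threshold is an assumed value.

HIDE_DISTANCE = 300.0  # pixels; illustrative assumption

def should_hide_preview(gesture_point, panel_rect):
    """panel_rect: (x, y, w, h). Returns True when the gesture point is more
    than HIDE_DISTANCE from the nearest edge of the preview panel."""
    gx, gy = gesture_point
    x, y, w, h = panel_rect
    dx = max(x - gx, 0, gx - (x + w))   # horizontal distance to the panel
    dy = max(y - gy, 0, gy - (y + h))   # vertical distance to the panel
    return math.hypot(dx, dy) > HIDE_DISTANCE
```

A gesture that remains inside or near the panel keeps it visible; one that wanders past the threshold dismisses it, returning the user interface to its prior state.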
In some embodiments, the preview panel 306 can be generated based on rules. For example, the rules can indicate a location and size of the preview panel based on the location of an application window displayed within a display screen. In some examples, the rules can be written in Extensible Application Markup Language (XAML), HTML, and the like, to imperatively or declaratively describe the rules that result in the creation of the preview panel.
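Although the passage above mentions XAML or HTML, the declarative idea can be sketched in Python for illustration; the rule schema, condition names, and sizes below are assumptions, not part of any described markup.

```python
# Illustrative stand-in for declaratively described preview panel rules.
# The schema (condition name, position, size) is a hypothetical example.

PREVIEW_RULES = [
    {"when": "window_overlapped", "position": "above",  "size": (320, 180)},
    {"when": "multiple_windows",  "position": "below",  "size": (320, 180)},
    {"when": "default",           "position": "center", "size": (480, 270)},
]

def select_rule(conditions):
    """conditions: mapping of condition name -> bool. Returns the first rule
    whose condition holds; the 'default' rule always matches."""
    for rule in PREVIEW_RULES:
        if rule["when"] == "default" or conditions.get(rule["when"], False):
            return rule
    return PREVIEW_RULES[-1]
```

Keeping the rules as data rather than code means the preview panel's layout policy can be edited, or expressed in a markup language such as XAML, without changing the logic that applies it.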
It is to be understood that the block diagram is not intended to indicate that the system is to contain all of the components shown. Rather, the system can include fewer or additional components not illustrated herein.
It is to be understood that the block diagrams are not intended to indicate that the systems are to contain all of the components shown. Rather, the systems can include fewer or additional components not illustrated herein.
The various software components discussed herein may be stored on the tangible, computer-readable storage media 500.
It is to be understood that any number of additional software components not shown can be included within the tangible, computer-readable storage media 500, depending on the specific application.
In one embodiment, a system for modifying user interfaces includes a processor and a memory to store a plurality of instructions that, in response to an execution by the processor, cause the processor to detect a plurality of display screens electronically coupled to the system. The plurality of instructions can also cause the processor to detect a first gesture corresponding to an application window displayed in one of the display screens and generate a preview panel in response to detecting the first gesture, the preview panel to be displayed proximate the application window, wherein the preview panel is to display a real-time image comprising content displayed in each of the display screens. Additionally, the plurality of instructions can cause the processor to detect a second gesture to move the application window to a different display screen and modify the user interface to display the application window in the different display screen.
Alternatively, or in addition, the plurality of display screens comprise a first display screen from a first device and a second display screen from a second device. Alternatively, or in addition, the first device is a tablet device and the second device is a projector device. Alternatively, or in addition, the plurality of instructions cause the processor to detect a drop of the application window on a seam in the preview panel and display the application window across two display screens adjacent to the seam. Alternatively, or in addition, the plurality of instructions cause the processor to detect a drop location of the application window based on a top location, a center location, or a bottom location of the different display screen of the preview panel and display the application window in the drop location. Alternatively, or in addition, the plurality of instructions cause the processor to detect a drop of the application window in the different display screen of the preview panel and display the application window in a maximized format or a minimized format in the different display screen. Alternatively, or in addition, the plurality of instructions cause the processor to detect a third gesture within the preview panel enabling user input to determine the display screens and locations within the display screens to display the application window. Alternatively, or in addition, the preview panel comprises actively executed applications being displayed in each of the display screens and wherein the processor is to move content from the application window to a second application window in response to detecting a drag and drop gesture. Alternatively, or in addition, the preview panel comprises actively executed applications being displayed in each of the display screens and wherein the processor is to move the application window from a remote device to the system. Alternatively, or in addition, the display screens comprise at least one virtual desktop. 
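The seam-drop and drop-location variants above can be sketched as hit-testing inside the preview panel. In this illustrative Python sketch (cell layout, tolerance value, and all names are assumptions), a drop near the boundary between two preview cells targets both adjacent screens, and the vertical third of the cell selects a top, center, or bottom placement:

```python
def classify_drop(drop_x, cells, seam_tolerance=8):
    """Map a drop x-coordinate inside the preview panel to target screen(s).

    `cells` is an ordered list of (screen_id, left, width) strips laid out
    side by side. A drop within `seam_tolerance` pixels of the boundary
    between two cells targets both adjacent screens, so the window is
    displayed across the seam.
    """
    for (id_a, left_a, w_a), (id_b, _left_b, _w_b) in zip(cells, cells[1:]):
        seam = left_a + w_a
        if abs(drop_x - seam) <= seam_tolerance:
            return [id_a, id_b]  # span the two screens adjacent to the seam
    for screen_id, left, width in cells:
        if left <= drop_x < left + width:
            return [screen_id]
    return []

def drop_region(drop_y, cell_height):
    """Top, center, or bottom third of the target cell chooses the location."""
    third = cell_height / 3
    if drop_y < third:
        return "top"
    if drop_y < 2 * third:
        return "center"
    return "bottom"

cells = [(0, 0, 200), (1, 200, 200)]  # two preview cells, 200 px wide each
```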
Alternatively, or in addition, the first gesture and the second gesture correspond to a drag and drop operation, wherein the first gesture is to be associated with initiating the drag and drop operation on an application title bar and the second gesture is to be associated with a release of the drag and drop operation. Alternatively, or in addition, the first gesture and the second gesture correspond to a drag and drop operation, wherein the first gesture is to be associated with initiating the drag and drop operation with a representation of an application window provided by an application switcher and the second gesture is to be associated with a release of the drag and drop operation. Alternatively, or in addition, the initiation of the drag and drop operation comprises a selection of the application title bar with an input device and the release of the drag and drop operation comprises releasing the selection from the input device in response to detecting a movement of the input device. Alternatively, or in addition, the plurality of instructions cause the processor to hide the preview panel in response to detecting the first gesture moving more than a predetermined distance from the preview panel.
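The two-gesture drag-and-drop protocol above (first gesture: selection on the application title bar; second gesture: a release after detected movement) can be sketched as a small state tracker. This is a hedged illustration; the movement threshold and all names are assumptions, not from the specification:

```python
class DragDropTracker:
    def __init__(self, move_threshold=5):
        self.move_threshold = move_threshold
        self.origin = None
        self.moved = False

    def press(self, x, y):
        """First gesture: select the application title bar with an input device."""
        self.origin = (x, y)
        self.moved = False

    def move(self, x, y):
        if self.origin is None:
            return
        dx, dy = x - self.origin[0], y - self.origin[1]
        if abs(dx) > self.move_threshold or abs(dy) > self.move_threshold:
            self.moved = True  # movement detected: this is a drag, not a click

    def release(self, x, y):
        """Second gesture: releasing the selection completes the drop.

        Returns the drop point, or None if no qualifying movement occurred.
        """
        completed = self.origin is not None and self.moved
        drop = (x, y) if completed else None
        self.origin = None
        self.moved = False
        return drop

tracker = DragDropTracker()
tracker.press(10, 10)
tracker.move(40, 12)
drop_point = tracker.release(40, 12)
```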
In another embodiment, a method for modifying a user interface can include detecting a plurality of display screens electronically coupled to the system and detecting a first gesture corresponding to an application window displayed in one of the display screens. The method can also include generating a preview panel in response to detecting the first gesture, the preview panel to be displayed proximate the application window, wherein the preview panel is to display a real-time image comprising content displayed in each of the display screens. Furthermore, the method can include detecting a second gesture to move the application window to a different display screen and modifying the user interface to display the application window in the different display screen.
Alternatively, or in addition, the plurality of display screens comprise a first display screen from a first device and a second display screen from a second device. Alternatively, or in addition, the first device is a tablet device and the second device is a projector device. Alternatively, or in addition, the method comprises detecting a drop of the application window on a seam in the preview panel and displaying the application window across two display screens adjacent to the seam. Alternatively, or in addition, the method comprises detecting a drop location of the application window based on a top location, a center location, or a bottom location of the different display screen of the preview panel and displaying the application window in the drop location. Alternatively, or in addition, the method comprises detecting a drop of the application window in the different display screen of the preview panel and displaying the application window in a maximized format or a minimized format in the different display screen. Alternatively, or in addition, the method comprises detecting a third gesture within the preview panel enabling user input to determine the display screens and locations within the display screens to display the application window. Alternatively, or in addition, the preview panel comprises actively executed applications being displayed in each of the display screens and wherein the method comprises moving content from the application window to a second application window in response to detecting a drag and drop gesture. Alternatively, or in addition, the preview panel comprises actively executed applications being displayed in each of the display screens and wherein the method comprises moving the application window from a remote device to the system. Alternatively, or in addition, the display screens comprise at least one virtual desktop. 
Alternatively, or in addition, the first gesture and the second gesture correspond to a drag and drop operation, wherein the first gesture is to be associated with initiating the drag and drop operation on an application title bar and the second gesture is to be associated with a release of the drag and drop operation. Alternatively, or in addition, the first gesture and the second gesture correspond to a drag and drop operation, wherein the first gesture is to be associated with initiating the drag and drop operation with a representation of an application window provided by an application switcher and the second gesture is to be associated with a release of the drag and drop operation. Alternatively, or in addition, the initiation of the drag and drop operation comprises a selection of the application title bar with an input device and the release of the drag and drop operation comprises releasing the selection from the input device in response to detecting a movement of the input device. Alternatively, or in addition, the method comprises hiding the preview panel in response to detecting the first gesture moving more than a predetermined distance from the preview panel.
In another embodiment, one or more computer-readable storage media for modifying a user interface include a plurality of instructions that, in response to execution by a processor, cause the processor to detect a plurality of display screens electronically coupled to the system. The plurality of instructions can also cause the processor to detect a first gesture corresponding to an application window displayed in one of the display screens and generate a preview panel in response to detecting the first gesture, the preview panel to be displayed proximate the application window, wherein the preview panel is to display a real-time image comprising content displayed in each of the display screens. Furthermore, the plurality of instructions can cause the processor to detect a second gesture to move the application window to a different display screen and modify the user interface to display the application window in the different display screen.
Alternatively, or in addition, the plurality of display screens comprise a first display screen from a first device and a second display screen from a second device. Alternatively, or in addition, the first device is a tablet device and the second device is a projector device. Alternatively, or in addition, the plurality of instructions cause the processor to detect a drop of the application window on a seam in the preview panel and display the application window across two display screens adjacent to the seam. Alternatively, or in addition, the plurality of instructions cause the processor to detect a drop location of the application window based on a top location, a center location, or a bottom location of the different display screen of the preview panel and display the application window in the drop location. Alternatively, or in addition, the plurality of instructions cause the processor to detect a drop of the application window in the different display screen of the preview panel and display the application window in a maximized format or a minimized format in the different display screen. Alternatively, or in addition, the plurality of instructions cause the processor to detect a third gesture within the preview panel enabling user input to determine the display screens and locations within the display screens to display the application window. Alternatively, or in addition, the preview panel comprises actively executed applications being displayed in each of the display screens and wherein the processor is to move content from the application window to a second application window in response to detecting a drag and drop gesture. Alternatively, or in addition, the preview panel comprises actively executed applications being displayed in each of the display screens and wherein the processor is to move the application window from a remote device to the system. Alternatively, or in addition, the display screens comprise at least one virtual desktop. 
Alternatively, or in addition, the first gesture and the second gesture correspond to a drag and drop operation, wherein the first gesture is to be associated with initiating the drag and drop operation on an application title bar and the second gesture is to be associated with a release of the drag and drop operation. Alternatively, or in addition, the first gesture and the second gesture correspond to a drag and drop operation, wherein the first gesture is to be associated with initiating the drag and drop operation with a representation of an application window provided by an application switcher and the second gesture is to be associated with a release of the drag and drop operation. Alternatively, or in addition, the initiation of the drag and drop operation comprises a selection of the application title bar with an input device and the release of the drag and drop operation comprises releasing the selection from the input device in response to detecting a movement of the input device. Alternatively, or in addition, the plurality of instructions cause the processor to hide the preview panel in response to detecting the first gesture moving more than a predetermined distance from the preview panel.
In particular and in regard to the various functions performed by the above described components, devices, circuits, systems and the like, the terms (including a reference to a “means”) used to describe such components are intended to correspond, unless otherwise indicated, to any component which performs the specified function of the described component, e.g., a functional equivalent, even though not structurally equivalent to the disclosed structure, which performs the function in the herein illustrated exemplary aspects of the claimed subject matter. In this regard, it will also be recognized that the innovation includes a system as well as computer-readable storage media having computer-executable instructions for performing the acts and events of the various methods of the claimed subject matter.
There are multiple ways of implementing the claimed subject matter, e.g., an appropriate API, tool kit, driver code, operating system, control, standalone or downloadable software object, etc., which enables applications and services to use the techniques described herein. The claimed subject matter contemplates the use from the standpoint of an API (or other software object), as well as from a software or hardware object that operates according to the techniques set forth herein. Thus, various implementations of the claimed subject matter described herein may have aspects that are wholly in hardware, partly in hardware and partly in software, as well as in software.
The aforementioned systems have been described with respect to interaction between several components. It can be appreciated that such systems and components can include those components or specified sub-components, some of the specified components or sub-components, and additional components, and according to various permutations and combinations of the foregoing. Sub-components can also be implemented as components communicatively coupled to other components rather than included within parent components (hierarchical).
Additionally, it can be noted that one or more components may be combined into a single component providing aggregate functionality or divided into several separate sub-components, and any one or more middle layers, such as a management layer, may be provided to communicatively couple to such sub-components in order to provide integrated functionality. Any components described herein may also interact with one or more other components not specifically described herein but generally known by those of skill in the art.
In addition, while a particular feature of the claimed subject matter may have been disclosed with respect to one of several implementations, such feature may be combined with one or more other features of the other implementations as may be desired and advantageous for any given or particular application. Furthermore, to the extent that the terms “includes,” “including,” “has,” “contains,” variants thereof, and other similar words are used in either the detailed description or the claims, these terms are intended to be inclusive in a manner similar to the term “comprising” as an open transition word without precluding any additional or other elements.
Number | Name | Date | Kind |
---|---|---|---|
5263134 | Paal et al. | Nov 1993 | A |
5455906 | Usuda | Oct 1995 | A |
5920316 | Oran et al. | Jul 1999 | A |
7802195 | Saul | Sep 2010 | B2 |
8054241 | Muklashy | Nov 2011 | B2 |
8490002 | Fai | Jul 2013 | B2 |
8847995 | Kimura | Sep 2014 | B2 |
8869065 | Mandic et al. | Oct 2014 | B2 |
8976140 | Hirata | Mar 2015 | B2 |
9460689 | Lee | Oct 2016 | B2 |
9471150 | Addaguduru | Oct 2016 | B1 |
9606723 | Selim | Mar 2017 | B2 |
9727205 | Freedman | Aug 2017 | B2 |
9798448 | Duffy | Oct 2017 | B2 |
9946373 | Graf | Apr 2018 | B2 |
10073613 | Liao et al. | Sep 2018 | B2 |
11113020 | Liu | Sep 2021 | B2 |
20030222923 | Li | Dec 2003 | A1 |
20040150668 | Myers et al. | Aug 2004 | A1 |
20060132474 | Lam | Jun 2006 | A1 |
20060200780 | Iwema et al. | Sep 2006 | A1 |
20060218499 | Matthews et al. | Sep 2006 | A1 |
20070168873 | Lentz | Jul 2007 | A1 |
20080036743 | Westerman et al. | Feb 2008 | A1 |
20080068290 | Muklashy | Mar 2008 | A1 |
20080115064 | Roach | May 2008 | A1 |
20090058842 | Bull et al. | Mar 2009 | A1 |
20090070670 | Kishi | Mar 2009 | A1 |
20090083655 | Beharie | Mar 2009 | A1 |
20090199128 | Matthews | Aug 2009 | A1 |
20090228831 | Wendker et al. | Sep 2009 | A1 |
20090235177 | Saul | Sep 2009 | A1 |
20090259967 | Davidson et al. | Oct 2009 | A1 |
20090300541 | Nelson | Dec 2009 | A1 |
20090313125 | Roh et al. | Dec 2009 | A1 |
20090327964 | Mouilleseaux et al. | Dec 2009 | A1 |
20100017744 | Kikuchi | Jan 2010 | A1 |
20100083111 | De Los Reyes | Apr 2010 | A1 |
20100083154 | Takeshita | Apr 2010 | A1 |
20100125806 | Igeta | May 2010 | A1 |
20100192102 | Chmielewski et al. | Jul 2010 | A1 |
20100214322 | Lim et al. | Aug 2010 | A1 |
20100306702 | Warner | Dec 2010 | A1 |
20110047459 | Van Der Westhuizen | Feb 2011 | A1 |
20110066980 | Chmielewski et al. | Mar 2011 | A1 |
20110148926 | Koo et al. | Jun 2011 | A1 |
20110169749 | Ganey et al. | Jul 2011 | A1 |
20110193939 | Vassigh et al. | Aug 2011 | A1 |
20110197147 | Fai | Aug 2011 | A1 |
20110202879 | Stovicek et al. | Aug 2011 | A1 |
20110209102 | Hinckley et al. | Aug 2011 | A1 |
20110239157 | Lin et al. | Sep 2011 | A1 |
20110260997 | Ozaki | Oct 2011 | A1 |
20110316807 | Corrion | Dec 2011 | A1 |
20120001945 | Oakley | Jan 2012 | A1 |
20120044164 | Kim et al. | Feb 2012 | A1 |
20120050314 | Wang | Mar 2012 | A1 |
20120050332 | Nikara et al. | Mar 2012 | A1 |
20120054671 | Thompson et al. | Mar 2012 | A1 |
20120072867 | Schlegel | Mar 2012 | A1 |
20120139815 | Aono et al. | Jun 2012 | A1 |
20120144347 | Jo | Jun 2012 | A1 |
20120223898 | Watanabe et al. | Sep 2012 | A1 |
20120289290 | Chae et al. | Nov 2012 | A1 |
20120306930 | Decker et al. | Dec 2012 | A1 |
20120320158 | Junuzovic et al. | Dec 2012 | A1 |
20120327121 | Dhawade et al. | Dec 2012 | A1 |
20130009903 | Shiota | Jan 2013 | A1 |
20130019173 | Kotler et al. | Jan 2013 | A1 |
20130019182 | Gil et al. | Jan 2013 | A1 |
20130019206 | Kotler et al. | Jan 2013 | A1 |
20130038544 | Park | Feb 2013 | A1 |
20130103446 | Bragdon et al. | Apr 2013 | A1 |
20130162569 | Sudo | Jun 2013 | A1 |
20130176255 | Kim et al. | Jul 2013 | A1 |
20130219340 | Linge | Aug 2013 | A1 |
20130237288 | Lee | Sep 2013 | A1 |
20130257777 | Benko et al. | Oct 2013 | A1 |
20130285933 | Sim et al. | Oct 2013 | A1 |
20130311954 | Minkkinen | Nov 2013 | A1 |
20130321319 | Kuramatsu | Dec 2013 | A1 |
20130321340 | Seo | Dec 2013 | A1 |
20140033119 | Kim et al. | Jan 2014 | A1 |
20140055390 | Lim et al. | Feb 2014 | A1 |
20140092140 | Wadhwa et al. | Apr 2014 | A1 |
20140160073 | Matsuki | Jun 2014 | A1 |
20140164991 | Kim | Jun 2014 | A1 |
20140168277 | Ashley et al. | Jun 2014 | A1 |
20140181739 | Yoo | Jun 2014 | A1 |
20140189583 | Yang | Jul 2014 | A1 |
20140218315 | Jeong | Aug 2014 | A1 |
20140267078 | Kukulski et al. | Sep 2014 | A1 |
20140289642 | Prasad | Sep 2014 | A1 |
20140325431 | Vranjes | Oct 2014 | A1 |
20140327626 | Harrison et al. | Nov 2014 | A1 |
20140351761 | Bae et al. | Nov 2014 | A1 |
20140365957 | Louch | Dec 2014 | A1 |
20140372926 | Szeto | Dec 2014 | A1 |
20150046871 | Lewis | Feb 2015 | A1 |
20150058808 | John et al. | Feb 2015 | A1 |
20150067552 | Leorin et al. | Mar 2015 | A1 |
20150067589 | Xiao et al. | Mar 2015 | A1 |
20150084885 | Kawamoto | Mar 2015 | A1 |
20150186016 | Li | Jul 2015 | A1 |
20150193099 | Murphy | Jul 2015 | A1 |
20150205455 | Shaw | Jul 2015 | A1 |
20150212667 | Holt et al. | Jul 2015 | A1 |
20150256592 | Young | Sep 2015 | A1 |
20150279037 | Griffin et al. | Oct 2015 | A1 |
20150319202 | Chai | Nov 2015 | A1 |
20150331594 | Terada et al. | Nov 2015 | A1 |
20150338998 | Chathoth et al. | Nov 2015 | A1 |
20150378502 | Hu et al. | Dec 2015 | A1 |
20160034157 | Vranjes et al. | Feb 2016 | A1 |
20160054881 | Yoshida et al. | Feb 2016 | A1 |
20160077650 | Durojaiye et al. | Mar 2016 | A1 |
20160077685 | Fang | Mar 2016 | A1 |
20160110076 | Reeves | Apr 2016 | A1 |
20160155410 | Nam | Jun 2016 | A1 |
20160162150 | Patel et al. | Jun 2016 | A1 |
20160162240 | Gu et al. | Jun 2016 | A1 |
20160170617 | Shi et al. | Jun 2016 | A1 |
20160179289 | Takamura et al. | Jun 2016 | A1 |
20160270656 | Samec et al. | Sep 2016 | A1 |
20160307344 | Monnier et al. | Oct 2016 | A1 |
20160334975 | Takeuchi et al. | Nov 2016 | A1 |
20170039414 | Sreenivas | Feb 2017 | A1 |
20170060319 | Seo et al. | Mar 2017 | A1 |
20170097141 | Hyodo et al. | Apr 2017 | A1 |
20170097746 | Doray et al. | Apr 2017 | A1 |
20170180678 | Fish et al. | Jun 2017 | A1 |
20170185037 | Lee et al. | Jun 2017 | A1 |
20170192733 | Huang | Jul 2017 | A1 |
20170255320 | Kumar et al. | Sep 2017 | A1 |
20170269771 | Nam et al. | Sep 2017 | A1 |
20170300205 | Villa et al. | Oct 2017 | A1 |
20170329413 | Kramer et al. | Nov 2017 | A1 |
20180196480 | Murphy | Jul 2018 | A1 |
20180203596 | Dhaliwal | Jul 2018 | A1 |
20180203660 | Hwang | Jul 2018 | A1 |
20180329508 | Klein | Nov 2018 | A1 |
20180329580 | Aurongzeb et al. | Nov 2018 | A1 |
20190129596 | Ligameri | May 2019 | A1 |
20190339854 | Wei | Nov 2019 | A1 |
20210097901 | Klein | Apr 2021 | A1 |
Entry |
---|
Shultz, Greg. “How to Juggle multiple applications using Task View in Windows 10”, Aug. 14, 2015, Tech Republic <<https://www.techrepublic.com/article/how-to-juggle-multiple-applications-using-task-view-in-windows-10/>> (Year: 2015). |
“Applicant Initiated Interview Summary Issued in U.S. Appl. No. 15/680,849”, dated Jul. 12, 2019, 05 Pages. |
“International Search Report and Written Opinion Issued in PCT Application No. PCT/US18/038380”, dated Sep. 28, 2018, 12 Pages. |
“International Search Report and Written Opinion Issued in PCT Application No. PCT/US18/038383”, dated Sep. 28, 2018, 12 Pages. |
“International Search Report and Written Opinion Issued in PCT Application No. PCT/US18/038393”, dated Oct. 10, 2018, 17 Pages. |
“International Search Report and Written Opinion Issued in PCT Application No. PCT/US18/038394”, dated Sep. 25, 2018, 11 Pages. |
“Non Final Office Action Issued in U.S. Appl. No. 15/680,849”, dated Jan. 10, 2019, 27 Pages. |
“Non Final Office Action Issued in U.S. Appl. No. 15/680,884”, dated Dec. 10, 2018, 18 Pages. |
“Final Office Action Issued In U.S. Appl. No. 15/680,849”, dated May 23, 2019, 25 Pages. |
“Non Final Office Action Issued in U.S. Appl. No. 15/680,713”, dated Nov. 20, 2019, 15 Pages. |
“Final Office Action Issued in U.S. Appl. No. 15/680,713”, dated Mar. 3, 2020, 16 Pages. |
“Non-Final Office Action Issued in U.S. Appl. No. 15/680,713”, dated Jun. 18, 2020, 14 Pages. |
“Non Final Office Action Issued in U.S. Appl. No. 15/680,849”, dated Jul. 9, 2020, 33 Pages. |
“Final Office Action Issued in U.S. Appl. No. 15/680,713”, dated Dec. 4, 2020, 14 Pages. |
“Office Action Issued in European Patent Application No. 18740029.6”, dated Dec. 18, 2020, 11 Pages. |
“Final Office Action Issued in U.S. Appl. No. 15/680,849”, dated Oct. 23, 2020, 35 Pages. |
“Notice of Allowance Issued in U.S. Appl. No. 15/680,713”, dated Sep. 17, 2021, 9 Pages. |
“Summons to Attend Oral Proceedings Issued in European Patent Application No. 18740029.6”, dated Jun. 15, 2021, 16 Pages. |
“Non Final Office Action Issued in U.S. Appl. No. 15/680,713”, dated Mar. 29, 2021, 16 Pages. |
Number | Date | Country | |
---|---|---|---|
20190056858 A1 | Feb 2019 | US |