The subject matter disclosed herein relates to computer system user interfaces, and more particularly, to navigation control for a tabletop computer system.
A tabletop computer system is typically oriented to provide a substantially flat table-like work surface, i.e., parallel to the floor with an upward-facing display. A tabletop computer system also typically has a relatively large surface area display that is touch sensitive. The display provides a touch-sensitive user interface, where touch-based user interaction can occur at any location on the display. With larger display areas it can be cumbersome for a user to navigate the touch-sensitive user interface. Reaching across the display to touch a distant location is one challenge, while making long dragging motions on the touch-sensitive user interface is another challenge. The challenges can be greater for users having a shorter arm length or reduced physical mobility.
One aspect of the invention is a system for providing navigation control for a tabletop computer system. The system includes a multi-touch display and processing circuitry coupled to the multi-touch display. The processing circuitry is configured to display a user interface on the multi-touch display and render a navigation pane on the multi-touch display. The navigation pane includes a reduced-scale copy of the user interface. The processing circuitry is also configured to detect a touch-based input at a position on the navigation pane and determine a scaled position on the user interface corresponding to the position on the navigation pane. The processing circuitry is further configured to interpret the touch-based input at the position on the navigation pane as an equivalent touch-based input at the scaled position on the user interface and trigger an event corresponding to the equivalent touch-based input at the scaled position on the user interface.
Another aspect of the invention is a method for providing navigation control for a tabletop computer system. The method includes displaying a user interface on a multi-touch display and rendering, by processing circuitry coupled to the multi-touch display, a navigation pane on the multi-touch display. The navigation pane includes a reduced-scale copy of the user interface. The method also includes detecting, by the processing circuitry, a touch-based input at a position on the navigation pane and determining, by the processing circuitry, a scaled position on the user interface corresponding to the position on the navigation pane. The processing circuitry interprets the touch-based input at the position on the navigation pane as an equivalent touch-based input at the scaled position on the user interface. The processing circuitry triggers an event corresponding to the equivalent touch-based input at the scaled position on the user interface.
Another aspect of the invention is a computer program product for providing navigation control for a tabletop computer system. The computer program product includes a non-transitory computer readable medium storing instructions for causing processing circuitry coupled to a multi-touch display to implement a method. The method includes displaying a user interface on the multi-touch display and rendering a navigation pane on the multi-touch display. The navigation pane includes a reduced-scale copy of the user interface. The method also includes detecting a touch-based input at a position on the navigation pane and determining a scaled position on the user interface corresponding to the position on the navigation pane. The processing circuitry interprets the touch-based input at the position on the navigation pane as an equivalent touch-based input at the scaled position on the user interface. The processing circuitry triggers an event corresponding to the equivalent touch-based input at the scaled position on the user interface.
These and other advantages and features will become more apparent from the following description taken in conjunction with the drawings.
The subject matter, which is regarded as the invention, is particularly pointed out and distinctly claimed in the claims at the conclusion of the specification. The foregoing and other features, and advantages of the invention are apparent from the following detailed description taken in conjunction with the accompanying drawings in which:
The detailed description explains embodiments of the invention, together with advantages and features, by way of example with reference to the drawings.
Exemplary embodiments provide navigation control for the tabletop computer system 100. Applying touch-based inputs directly at any position on the multi-touch display 126 of the tabletop computer system 100 may become challenging for larger values of the diagonal measurement D. Exemplary embodiments, as further described herein, provide a navigation pane on the multi-touch display 126 that displays a reduced-scale copy of a user interface of the multi-touch display 126. The tabletop computer system 100 includes processing circuitry that is configured to detect touch-based input at a position on the navigation pane and determine a scaled position on the user interface corresponding to the position on the navigation pane. The touch-based input at the position on the navigation pane is interpreted as an equivalent touch-based input at the scaled position on the user interface, resulting in triggering of an event corresponding to the equivalent touch-based input at the scaled position on the user interface. Although the system described herein refers to a tabletop computer system, the system and methods described herein can apply to touch-sensitive computer systems in a variety of orientations, such as a wall-mounted computer system.
In exemplary embodiments, in terms of hardware architecture, as shown in
The processing circuitry 205 is hardware for executing software, particularly software stored in memory 210. The processing circuitry 205 can include any custom made or commercially available processor, a central processing unit (CPU), an auxiliary processor among several processors associated with the processing system 201, a semiconductor based microprocessor (in the form of a microchip or chip set), a macroprocessor, or generally any device for executing software instructions.
The memory 210 can include any one or combination of volatile memory elements (e.g., random access memory (RAM, such as DRAM, SRAM, SDRAM, etc.)) and nonvolatile memory elements (e.g., ROM, erasable programmable read only memory (EPROM), electronically erasable programmable read only memory (EEPROM), flash memory, memory card, programmable read only memory (PROM), tape, compact disc read only memory (CD-ROM), digital versatile disc (DVD), disk, diskette, cartridge, cassette or the like, etc.). Moreover, the memory 210 may incorporate electronic, magnetic, optical, and/or other types of storage media. The memory 210 can have a distributed architecture, where various components are situated remote from one another but can be accessed by the processing circuitry 205.
Software in memory 210 may include one or more separate programs, each of which includes an ordered listing of executable instructions for implementing logical functions. In the example of
The navigation pane control 202 may be implemented in the form of a source program, executable program (object code), script, or any other entity comprising a set of instructions to be performed. When implemented as a source program, the program may be translated via a compiler, assembler, interpreter, or the like, which may or may not be included within the memory 210, so as to operate properly in conjunction with the OS 211 and/or the applications 212. Furthermore, the navigation pane control 202 can be written in an object oriented programming language, which has classes of data and methods, or a procedural programming language, which has routines, subroutines, and/or functions.
In exemplary embodiments, the input/output controller 235 receives touch-based inputs from the multi-touch display 126 as detected touches, gestures, and/or movements. The multi-touch display 126 can detect input from one finger 236, multiple fingers 237, a stylus 238, and/or another physical object 239. Multiple inputs can be received contemporaneously or sequentially from one or more users. The multi-touch display 126 may also support physical object recognition using, for instance, one or more scannable code labels 242 on each physical object 239. In one example, the multi-touch display 126 includes infrared (IR) sensing capabilities to detect touches, shapes, and/or scannable code labels. Physical object 239 may be, for instance, a user identification card having an associated IR-detectable pattern for the user as one or more scannable code labels 242 to support login operations or user account and permissions configuration.
The I/O devices 240, 245 may include other input or output devices, for example but not limited to a printer, a scanner, a microphone, speakers, a secondary display, and the like. The I/O devices 240, 245 may further include devices that communicate both inputs and outputs, for instance but not limited to, components of a wireless interface such as a network interface card (NIC) or modulator/demodulator (for accessing other files, devices, systems, or a network), a radio frequency (RF) or other transceiver, a telephonic interface, a bridge, a router, a mobile device, a portable memory storage device, and the like.
In exemplary embodiments, the system 200 can further include a network interface 260 for coupling to a network 214. The network 214 can be an IP-based network for communication between the processing system 201 and any external server, client and the like via a broadband connection. The network 214 transmits and receives data between the processing system 201 and external systems. In exemplary embodiments, network 214 can be a managed IP network administered by a service provider. The network 214 may be implemented in a wireless fashion, e.g., using wireless protocols and technologies, such as WiFi, WiMax, etc. The network 214 can also be a packet-switched network such as a local area network, wide area network, metropolitan area network, Internet network, or other similar type of network environment. The network 214 may be a fixed wireless network, a wireless local area network (LAN), a wireless wide area network (WAN), a personal area network (PAN), a virtual private network (VPN), intranet or other suitable network system and includes equipment for receiving and transmitting signals.
If the processing system 201 is a PC, workstation, intelligent device or the like, software in the memory 210 may further include a basic input output system (BIOS) (omitted for simplicity). The BIOS is a set of essential software routines that initialize and test hardware at startup, start the OS 211, and support the transfer of data among the hardware devices. The BIOS is stored in ROM so that the BIOS can be executed when the processing system 201 is activated.
When the processing system 201 is in operation, the processing circuitry 205 is configured to execute software stored within the memory 210, to communicate data to and from the memory 210, and to generally control operations of the processing system 201 pursuant to the software. The navigation pane control 202, the OS 211, and the applications 212 in whole or in part, but typically the latter, are read by the processing circuitry 205, perhaps buffered within the processing circuitry 205, and then executed.
When the systems and methods described herein are implemented in software, as is shown in
An example of a navigation pane 400 rendered on the multi-touch display 126 is depicted in
The navigation pane 400 is sized as a scaled version of the user interface 300. For example, the navigation pane 400 may have a 1/10 scaling relative to the user interface 300. The relative scaling relationship enables scaling of positions on the navigation pane 400 to scaled positions on the user interface 300. Similarly, movements across the navigation pane 400 are scaled and applied as if they are directly made on the user interface 300. When the navigation pane 400 is actively displayed, users may provide inputs either through the navigation pane 400 or directly on the user interface 300. Alternatively, the tabletop computer system 100 of
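The scaling relationship described above can be sketched as a simple coordinate mapping. The function name, the pane origin, and the 1/10 scale factor below are illustrative assumptions, not part of the disclosed embodiment:

```python
def pane_to_ui(pane_x, pane_y, pane_origin=(0, 0), scale=0.1):
    """Map a point on the navigation pane to the full user interface.

    pane_origin: top-left corner of the pane on the display (assumed).
    scale: size of the pane relative to the user interface (e.g., 1/10).
    """
    # Translate into pane-local coordinates, then undo the reduction.
    local_x = pane_x - pane_origin[0]
    local_y = pane_y - pane_origin[1]
    return (local_x / scale, local_y / scale)
```

For instance, with a 1/10 scale pane whose top-left corner sits at (100, 100), a touch at (150, 130) on the pane maps to the scaled position (500, 300) on the user interface.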
In the example of
A user may select an initial position for the frame 504 and navigation pane 500 based on where a gesture to launch the navigation pane 500 is made or physical object 239 of
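Selecting an initial position for the frame based on where the launch gesture occurs can be sketched as centering the pane on the gesture and clamping it to the display bounds. This placement policy and all names are hypothetical:

```python
def initial_pane_origin(gesture_x, gesture_y, pane_w, pane_h, disp_w, disp_h):
    """Place the pane's top-left corner near the launch gesture, clamped
    so the pane remains fully on the display (hypothetical policy)."""
    x = min(max(gesture_x - pane_w / 2, 0), disp_w - pane_w)
    y = min(max(gesture_y - pane_h / 2, 0), disp_h - pane_h)
    return (x, y)
```

A gesture near a display corner thus still yields a fully visible pane, e.g., a gesture at (0, 0) on a 1920x1080 display places a 200x150 pane at (0, 0) rather than partially off-screen.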
Multiple navigation panes may be useful where the diagonal measurement D (
At block 906, the processing circuitry 205 renders a navigation pane, such as the navigation pane 400, on the multi-touch display 126, where the navigation pane 400 is a reduced-scale copy of the user interface 300. The navigation pane may be statically positioned on the multi-touch display 126. Alternatively, the processing circuitry 205 can be configured to dynamically position the navigation pane on the multi-touch display 126, such as the dynamic positioning of the navigation pane 500 of
At block 908, the processing circuitry 205 determines whether a touch-based input is detected at a position on the navigation pane, such as touch-based input 402 at position 404 on the navigation pane 400 of
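The detection at block 908 amounts to a bounds check of the touch coordinates against the pane's frame. A minimal sketch, with a hypothetical rectangle type:

```python
from dataclasses import dataclass

@dataclass
class PaneFrame:
    # Hypothetical frame of the navigation pane on the display.
    x: float
    y: float
    width: float
    height: float

    def contains(self, touch_x, touch_y):
        """Return True if the touch lands inside the pane's frame."""
        return (self.x <= touch_x < self.x + self.width
                and self.y <= touch_y < self.y + self.height)
```

A touch failing this check would instead be handled as a direct touch-based input on the user interface.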
At block 910, the processing circuitry 205 determines a scaled position on the user interface corresponding to the position on the navigation pane. As described in the example of
At block 912, the processing circuitry 205 interprets the touch-based input at the position on the navigation pane as an equivalent touch-based input at the scaled position on the user interface. Continuing with the example of
At block 914, the processing circuitry 205 triggers an event corresponding to the equivalent touch-based input at the scaled position on the user interface. The triggered event is the same event that would be triggered by directly touching the user interface at the scaled position. For example, if applying a particular gesture directly at the scaled position 406 on the user interface 300 results in opening a particular context menu (not depicted), then applying the same gesture at the position 404 of the navigation pane 400 corresponding to the scaled position 406 results in opening the same context menu (not depicted) for interactive use at both the position 404 and the scaled position 406.
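Blocks 910 through 914 can be sketched together as scaling the touch position and re-dispatching the same gesture at the scaled position. The event shape and dispatcher below are hypothetical stand-ins for whatever event system the user interface uses:

```python
def handle_pane_touch(touch, pane, dispatch):
    """Reinterpret a pane touch as an equivalent touch on the user interface.

    touch: dict with 'x', 'y', and 'gesture' keys (hypothetical shape).
    pane: dict with 'origin' (x, y) and 'scale' keys (hypothetical shape).
    dispatch: callable that triggers the same event a direct touch at the
              scaled position would trigger.
    """
    ox, oy = pane['origin']
    scaled_x = (touch['x'] - ox) / pane['scale']
    scaled_y = (touch['y'] - oy) / pane['scale']
    # The gesture itself is unchanged; only its position is rescaled.
    return dispatch({'x': scaled_x, 'y': scaled_y, 'gesture': touch['gesture']})
```

Because the dispatched event is identical to one produced by a direct touch, the same context menu, selection, or other event results regardless of which surface received the input.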
As previously described, the processing circuitry 205 can be configured to launch a navigation pane based on one or more of: a detected gesture on the multi-touch display 126, a detected touch of an icon on the multi-touch display 126, and a detected placement of a physical object on the multi-touch display 126. The navigation pane may be resizable with corresponding rescaling relative to the user interface as described in the example of
Multiple instances of the process 900 can operate in parallel such that additional navigation panes can be contemporaneously displayed, where the processing circuitry 205 is configured to render one or more additional navigation panes on the multi-touch display 126. An example of this is depicted in
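With multiple panes active contemporaneously, each incoming touch can be routed through the first pane whose frame contains it, falling back to direct interpretation otherwise. A hypothetical routing sketch, where each pane is an (origin, size, scale) tuple:

```python
def route_touch(touch_x, touch_y, panes):
    """Return the scaled user-interface position from the first pane
    containing the touch, or the raw position if the touch falls outside
    every pane. All names and the tuple layout are illustrative."""
    for (ox, oy), (w, h), scale in panes:
        if ox <= touch_x < ox + w and oy <= touch_y < oy + h:
            return ((touch_x - ox) / scale, (touch_y - oy) / scale)
    # Outside all panes: treat as a direct touch on the user interface.
    return (touch_x, touch_y)
```

Two users at opposite edges of the display could thereby each control the full user interface through their own pane without interfering with one another.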
In exemplary embodiments, a technical effect is providing navigation control for a tabletop computer system. Providing a navigation pane as a scaled version of the user interface enables a user to make physically smaller movements relative to a larger sized user interface while providing access to control features of the larger sized user interface. The navigation pane can be generated as an interactive mirrored copy of the larger sized user interface on a same multi-touch display of the tabletop computer system.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method or computer program product. Accordingly, aspects may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module” or “system.” Furthermore, aspects may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.
Any combination of one or more computer readable medium(s) may be utilized including a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium as a non-transitory computer program product may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
Aspects are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which includes one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the Figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In exemplary embodiments, where the navigation pane control 202 of
While the invention has been described in detail in connection with only a limited number of embodiments, it should be readily understood that the invention is not limited to such disclosed embodiments. Rather, modifications can incorporate any number of variations, alterations, substitutions or equivalent arrangements not heretofore described, but which are commensurate with the spirit and scope of the invention. Additionally, while various embodiments have been described, it is to be understood that aspects may include only some of the described embodiments. Accordingly, the invention is not to be seen as limited by the foregoing description, but is only limited by the scope of the appended claims.