1. Technical Field
The present invention(s) generally relate to graphical user interfaces (GUIs) and, more particularly, relate to selection or placement of graphical objects through a GUI, such as one provided by a touch-enabled computing device.
2. Description of Related Art
With touch-enabled computing devices, such as tablets, smart phones, and the like, the user typically operates the device using one or more of their fingertips on a touch sensitive display. Generally, a cursor is not presented on the touch sensitive display, and the location of at least one of the fingertips takes the place of a cursor on the touch sensitive display. However, unlike use of a mouse and a cursor, which floats above graphical elements (e.g., text, shapes, or other graphical objects) presented on a display device, use of fingers and fingertips on a touch sensitive display obscures a user's view of graphical elements presented on the touch sensitive display below those fingers and fingertips. This presents a problem when a user attempts to access graphical elements displayed on the touch sensitive display, such as when the user selects, resizes, positions, orients, or connects graphical objects located below the fingers or fingertips.
Various embodiments described herein provide for systems and methods that assist in selection or placement of a graphical object through a graphical user interface (GUI), such as one provided by a touch screen display of a computing device.
According to some embodiments, a system or method detects a first condition for presenting a view window on a touch screen display, determines a first position of a fingertip on the touch screen display, and then presents the view window on the touch screen display at a second position based on the first position (e.g., relative to the first position). The view window may be configured to provide a first view of graphical content presented under the fingertip on the touch screen display at the first position (e.g., while the fingertip remains at the first position). For instance, where a user selects a graphical object presented on the touch screen display, and the user performs the selection using their fingertip, the view window would present a view of the graphical object as it appears on the touch screen display under the user's fingertip. Subsequently, the system or method may detect movement of the fingertip from the first position to a third position (e.g., in association with selecting, moving, orienting, connecting, or resizing a graphical object), and move the view window accordingly from the second position to a fourth position based on the third position (e.g., relative to the third position). Additionally, the system or method may update the view window to provide a second view of graphical content presented under the fingertip on the touch screen display at the third position. In this way, as the fingertip moves across the touch screen display (e.g., as the fingertip is dragged across the touch screen display), the view window can move accordingly, tracking the fingertip. Eventually, the system or method may detect a second condition for removing the view window from the touch screen display. The first view, the second view, or both may provide a magnified view of graphical content presented under the fingertip on the touch screen display.
For some embodiments, the graphical content presented at the first position on the touch screen display, or presented at the second position on the touch screen display, comprises one or more graphical objects. The graphical content can include shapes (e.g., circles, quadrilaterals, triangles, etc.), text, lines, stencil objects, an image (e.g., imported), or the like. The system or method may determine the second position based on the first position or may determine the fourth position based on the third position. For instance, the second position may be determined according to a predetermined distance from the first position, or the fourth position may be determined according to a predetermined distance from the third position.
Depending on the embodiment, the first condition may comprise the fingertip being positioned within a predetermined distance from a graphical object presented on the touch screen display (e.g., the fingertip being moved within the predetermined distance from the graphical object). The first condition may comprise selection of a graphical object presented on the touch screen display, where the selection is caused by use of the fingertip on the touch screen display. The first condition may be limited to one or more specific types of graphical object (e.g., an end point of a line, an anchor point, a vertex, etc.). The first condition may relate to selecting, moving, orienting, connecting, or resizing a first graphical object presented on the touch screen display, where the selection, movement, orientation, connection, or resizing is caused by use of the fingertip on the touch screen display. For some embodiments, the selecting, moving, orienting, connecting, or resizing of the first graphical object involves two or more fingertips, which may result in the user's hand obscuring the user's view of the first graphical object during that operation. The first condition may comprise a first graphical object presented on the touch screen display being positioned within a predetermined distance from a second graphical object presented on the touch screen display. The first condition may comprise receiving from a user an instruction to enable the view window (e.g., a user instruction through an element of the GUI).
Depending on the embodiment, the second condition may comprise the fingertip being positioned outside a predetermined distance from a graphical object presented on the touch screen display (e.g., the fingertip is moved away from the graphical object). The second condition may comprise de-selection of a (currently selected) graphical object presented on the touch screen display. The second condition may comprise removal of contact between the fingertip and the touch screen display. The second condition may comprise a first graphical object presented on the touch screen display being positioned outside a predetermined distance from a second graphical object presented on the touch screen display. The second condition may comprise receiving from a user an instruction to disable the view window (e.g., a user instruction through an element of the GUI).
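The first and second conditions above can be sketched as a simple show/hide decision driven by fingertip contact and proximity. The following is a minimal illustrative sketch, not the claimed implementation; the function and threshold names (`SHOW_DISTANCE`, `HIDE_DISTANCE`, `view_window_visible`) are hypothetical, and the use of a larger hide threshold (hysteresis) is an assumption added to avoid flicker near the boundary.

```python
import math

# Hypothetical thresholds (assumptions); the embodiments describe only a
# "predetermined distance" without specifying values.
SHOW_DISTANCE = 40.0  # px: first condition threshold
HIDE_DISTANCE = 60.0  # px: second condition threshold

def _distance(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def view_window_visible(currently_visible, fingertip, object_pos, touching,
                        user_override=None):
    """Decide whether the view window should be shown.

    First condition: fingertip within SHOW_DISTANCE of a graphical object.
    Second condition: contact removed, or fingertip outside HIDE_DISTANCE.
    user_override models an explicit enable/disable instruction via the GUI.
    """
    if user_override is not None:
        return user_override
    if not touching:
        return False  # contact removed -> second condition met
    d = _distance(fingertip, object_pos)
    if not currently_visible:
        return d <= SHOW_DISTANCE  # first condition
    return d <= HIDE_DISTANCE      # hysteresis keeps the window stable
```

Using a hide threshold larger than the show threshold is one design choice among several the embodiments would permit; a single shared distance would also satisfy the described conditions.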
Various embodiments provide for a computer program product comprising computer instruction codes configured to cause a computer system to perform various operations described herein.
Other features and aspects of various embodiments will become apparent from the following detailed description, taken in conjunction with the accompanying drawings, which illustrate, by way of example, the features of such embodiments.
Various embodiments are described in detail with reference to the following figures. The drawings are provided for purposes of illustration only and merely depict some embodiments. These drawings shall not be considered limiting of the breadth, scope, or applicability of embodiments.
Various embodiments described herein provide for systems and methods that assist in selection or placement of a graphical object through a graphical user interface (GUI), such as one provided by a touch screen display of a computing device.
Some embodiments provide for a system or method that assists with arranging or connecting a graphical object presented on a display device, such as a line, a vertex (e.g., elbow), a rectangle, a circle, a text box, an image, and the like. The system or method may assist in precision selection, movement, orientation, connection, or resizing of a graphical object, which may be presented on a graphical user interface (GUI) canvas. A system or method may, for example, facilitate precision selection of a graphical object's (e.g., a line's) end point or placement of such an end point on top of another object's (e.g., a square's) anchor or attachment point. Such a system or method can be beneficial with a touch screen display, where using one or more fingertips to access a graphical object on the touch screen display can obstruct a user's view of the graphical object. This can result in the user performing less than accurate selecting, positioning, orienting, connecting, or resizing of the graphical object, or make such actions difficult.
A system or method may assist in selecting, moving, orienting, connecting, or resizing a graphical object by providing on a touch screen display an augmented view of the touch screen display directly under a user's fingertip. An embodiment may present a view window (or view port) positioned above the user's fingertip. The view within the view window may be set to a higher magnification, may be the same view size, or may be a reduced view of the area presented on the touch screen display below and around the user's fingertip. This may permit the user to see what lies just below their fingertip, which in turn can permit the user to accurately select, move, orient, connect, or resize graphical objects. Additionally, this may permit the user to perform an operation with respect to graphical content presented within the view window, such as moving a selected element or completing a task (e.g., connecting elements or objects).
For some embodiments, systems or methods described herein are implemented with a digital whiteboard system, such as one similar to those described in U.S. Patent Application Publication No. 2011/0246875 and U.S. Patent Application Publication No. 2013/0111380, which are hereby incorporated by reference herein. On a graphical user interface (GUI) canvas, a user may connect line and elbow connectors to various shapes, such as rectangles and circles. The user may select an end point of a line or elbow connector and drag it toward a graphical object, such as a circle or square. As the user's fingertip approaches the graphical object, the view window may automatically become visible, and the user can readily view the end point of the line connector as they guide the end point toward the graphical object by dragging their finger. As the user's finger moves closer to the graphical object, the graphical object's potential connection, or anchor, points become visible (e.g., are enabled). Through the view window, the user can see the relative gap between the end point and the graphical object's anchor point and can easily maneuver the end point on top of the graphical object's anchor point to accurately connect the two. For some embodiments, when the end point reaches a certain distance from the graphical object's anchor point, the end point snaps to the anchor point to form a connection between the line and the graphical object.
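The snap-to-anchor behavior described above can be sketched as a nearest-anchor search with a distance threshold. This is an illustrative sketch only; the names (`maybe_snap`, `SNAP_DISTANCE`) and the threshold value are assumptions, as the embodiments specify only "a certain distance."

```python
import math

SNAP_DISTANCE = 12.0  # px: illustrative snap threshold (assumption)

def maybe_snap(end_point, anchor_points):
    """Return (position, snapped): the nearest anchor point if the dragged
    end point is within SNAP_DISTANCE of it, forming a connection;
    otherwise the end point unchanged."""
    best, best_d = None, float("inf")
    for anchor in anchor_points:
        d = math.hypot(end_point[0] - anchor[0], end_point[1] - anchor[1])
        if d < best_d:
            best, best_d = anchor, d
    if best is not None and best_d <= SNAP_DISTANCE:
        return best, True   # snapped onto the anchor point
    return end_point, False
```

In use, a drag handler would call this on each fingertip move; when `snapped` is true, the line's end point is repositioned onto the anchor and the connection is recorded.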
The digital whiteboard system 102 and each of the client devices 106 may be implemented using one or more digital devices, which may be similar to the digital devices discussed later with respect to
For instance, through the computer network 104, the client device 106-1 can provide and receive updates to a GUI canvas presented on a touch screen display coupled to the client device 106-1. Through systems or methods described herein, a user may select, move, orient, resize, or connect graphical objects on a GUI canvas, and such actions at the client device 106-1 can cause updates to be sent to the digital whiteboard system 102. Other client devices 106 that have shared access to the GUI canvas may receive updates via the digital whiteboard system 102 or directly from the client device 106-1. For some embodiments, the GUI canvas constitutes a digital whiteboard on which one or more users may add, remove, and modify (e.g., select, move, orient, connect, resize, etc.) graphical objects, including text, shapes, images, and the like. The GUI canvas presented through the client devices 106 may be configured to provide users with (or provide the user with the experience of) an infinitely-sized work space.
Computing devices may include a mobile phone, a tablet computing device, a laptop, a desktop computer, a personal digital assistant, a portable gaming unit, a wired gaming unit, a thin client, a set-top box, a portable multi-media player, or any other type of touch-enabled computing device known to those of skill in the art. Further, the digital whiteboard system 102 may comprise one or more servers, which may be operating on or implemented using one or more cloud-based services (e.g., Software-as-a-Service [SaaS], Platform-as-a-Service [PaaS], or Infrastructure-as-a-Service [IaaS]).
The touch screen display 200 can represent any touch-sensitive display device that can receive user input by way of human contact (e.g., hand, fingers, fingertips, etc.) and convey said user input to a computing device to which the touch-sensitive display device is coupled. Depending on the client device 106-1, the touch screen display 200 may be a separate device from the client device 106-1 and coupled to the client device 106-1 through a data interface. For client devices such as tablets, smartphones, and touch-enabled laptops, the touch screen display 200 may be one integrated into the device. Where the client device 106-1 interfaces with a digital whiteboard system, such as the one shown in
The GUI canvas module 202 may be configured to provide or otherwise facilitate presentation of a GUI canvas at the client device 106-1 through the touch screen display 200. The GUI canvas module 202 may be further configured to update graphical content on the GUI canvas based on user input received through the touch screen display 200 or any other human interface device (HID) coupled to the client device 106-1. The GUI canvas module 202 may further update the GUI canvas based on information received from other client devices 106 that have access to the same GUI canvas (e.g., via the digital whiteboard system 102).
According to some embodiments, the graphical object assistance system 204 is configured to augment or otherwise modify the GUI canvas provided by the GUI canvas module 202 (e.g., via an application software interface) such that the GUI canvas presents graphical tools that assist a user with placement, selection, orientation, connection, resizing, or some other operation with respect to one or more graphical objects (e.g., text boxes, shapes, images, lines, etc.) presented on the GUI canvas. For some embodiments, the graphical object assistance system 204 may augment or otherwise modify the GUI canvas to provide a view window on the GUI canvas, where the view window may be configured to provide a real-time view of graphical content currently being presented under a user's fingertip on the touch screen display 200. As also shown in
The detection module 206 may be configured to detect a first condition for invoking (e.g., presenting) a view window on the GUI canvas presented on the touch screen display 200 by the GUI canvas module 202. The detection module 206 may be further configured to detect movement of a fingertip across the touch screen display 200 as a user accesses the GUI canvas and the graphical objects presented thereon. For instance, the detection module 206 may be configured to detect a user's fingertip moving from a first position to a second position on the touch screen display 200 as the user drags a graphical object on the GUI canvas. Depending on the embodiment, the detection module 206 may also be configured to detect a second condition for removing (e.g., hiding or moving off screen) the view window currently presented on the GUI canvas on the touch screen display 200.
The positioning module 208 may be configured to determine a position of a fingertip on the touch screen display 200 based on the first condition. For instance, in response to the first condition being met, the positioning module 208 may determine the first position of the fingertip while the fingertip is in contact with the touch screen display 200. For some embodiments, the positioning module 208 may be configured to determine the positioning of a view window when the view window is presented on the GUI canvas on the touch screen display 200. For example, where the positioning module 208 determines that the fingertip is at a first position on the touch screen display 200 when the first condition is met, the positioning module 208 may determine a second position on the touch screen display 200, relative to the first position on the touch screen display 200, to present the view window on the GUI canvas. For some embodiments, the second position is determined to be within a predetermined distance from the first position. Additionally, for some embodiments, the second position is determined according to the first position's proximity to the edge of the visible area of the touch screen display 200.
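The positioning logic just described — placing the view window at a predetermined offset from the fingertip while accounting for the edge of the visible area — can be sketched as follows. This is a minimal sketch under stated assumptions: the function name, the window size, and the offset value are all hypothetical, and clamping to the screen bounds is one simple way to honor the edge-proximity consideration.

```python
OFFSET = 80.0                    # px above the fingertip (assumption)
WINDOW_W, WINDOW_H = 120.0, 120.0  # view window size (assumption)

def view_window_position(fingertip, screen_w, screen_h):
    """Place the view window centered horizontally above the fingertip
    (the 'second position' relative to the 'first position'), clamped so
    the window stays within the visible area of the display."""
    x = fingertip[0] - WINDOW_W / 2
    y = fingertip[1] - OFFSET - WINDOW_H  # above the finger
    x = min(max(x, 0.0), screen_w - WINDOW_W)
    y = min(max(y, 0.0), screen_h - WINDOW_H)
    return x, y
```

Near a screen edge, the clamp shifts the window inward rather than letting it fall partially off screen; an alternative design could flip the window below the fingertip instead.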
The view window module 210 may be configured to present a view window on the touch screen display 200 at a position based on the position of a fingertip on the touch screen display 200. As described herein, the view window may be configured to provide a view based on the position of the fingertip on the touch screen display 200. For instance, the view window can provide a view of the graphical content presented under or around the fingertip on the touch screen display 200 (e.g., when the fingertip is in contact with the touch screen display 200). Depending on the embodiment, the view window may be presented, and may be removed, according to conditions detected by the detection module 206. Additionally, depending on the embodiment, the view provided through the view window may have a larger, similar, or smaller magnification than the actual view of the graphical contents presented under or around the fingertip on the touch screen display 200. For some embodiments, when the fingertip moves from a first position to a second position, the detection module 206 detects such movement of the fingertip and the positioning module 208 determines the positioning of the fingertip on the touch screen display 200. In some such embodiments, the positioning module 208 can determine the positioning of the view window according to the positioning of the fingertip (thereby permitting the view window to follow the fingertip), and the view window module 210 can provide an updated view that reflects the graphical content currently presented on the touch screen display 200 under or around the fingertip at the second position.
The method 300 may start at operation 302, where the detection module 206 detects a first condition for invoking (e.g., presenting) a view window on a touch screen display 200. Depending on the embodiment, the first condition may comprise a fingertip, in contact with the touch screen display 200, being positioned within a predetermined distance from a graphical object presented on the touch screen display 200. The first condition may comprise selection of a graphical object presented on the touch screen display 200, where the selection is caused by use of the fingertip on the touch screen display 200. The first condition may be limited to one or more specific types of graphical object, such as an end point of a line, an anchor point, or a vertex of a shape. The first condition may relate to selecting, moving, orienting, connecting, or resizing of a first graphical object presented on the touch screen display 200, where the selection, movement, orientation, or resizing may be caused by use of the fingertip on the touch screen display 200. The first condition may comprise a first graphical object presented on the touch screen display 200 being positioned within a predetermined distance from a second graphical object presented on the touch screen display 200. Additionally, the first condition may comprise receiving from a user an instruction to enable the view window (e.g., a user instruction through an element of the GUI).
At operation 304, the positioning module 208 determines a first position of a fingertip on the touch screen display 200 based on the first condition. For example, in response to the first condition being met, the positioning module 208 may determine the first position of the fingertip while the fingertip is in contact with the touch screen display 200.
At operation 306, the view window module 210 presents the view window on the touch screen display 200 at a second position based on the first position, where the view window provides a view based on the first position of the fingertip. For some embodiments, the view window provides a view of graphical content presented under the fingertip, on the touch screen display 200, while the fingertip is at the first position.
At operation 308, the detection module 206 detects movement of the fingertip from the first position to a third position on the touch screen display 200. At operation 310, the view window module 210 moves the view window from the second position to a fourth position on the touch screen display 200 based on the third position. Additionally, at operation 312, the view window module 210 updates the view window to provide an updated view based on the third position. In this way, as the view window moves from the second position to the fourth position, the view provided by the view window can be updated dynamically and may be updated in real-time (e.g., as movement of the fingertip occurs).
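The update at operation 312 amounts to recomputing which region of canvas content the view window should render as the fingertip moves. A minimal sketch, assuming a centered source region and a configurable magnification (names and the 2.0 factor are illustrative, not from the embodiments):

```python
MAGNIFICATION = 2.0  # assumption; embodiments allow larger, equal, or smaller

def view_region(fingertip, view_w, view_h, magnification=MAGNIFICATION):
    """Compute the rectangle (x, y, w, h) of canvas content, centered
    under the fingertip, that the view window should render. A higher
    magnification means a smaller source region scaled up to fill the
    window, producing a magnified view."""
    src_w = view_w / magnification
    src_h = view_h / magnification
    return (fingertip[0] - src_w / 2, fingertip[1] - src_h / 2, src_w, src_h)
```

Calling this on every detected fingertip move, together with repositioning the window itself, yields the real-time tracking behavior described: the window follows the finger while its contents continuously reflect what lies beneath it.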
At operation 314, the detection module 206 detects a second condition for removing the view window from the touch screen display 200. Depending on the embodiment, the second condition may comprise the fingertip, in contact with the touch screen display 200, being positioned outside a predetermined distance from a graphical object presented on the touch screen display 200. The second condition may comprise de-selection of a currently selected graphical object presented on the touch screen display 200. The second condition may comprise the user lifting their finger from the touch surface of the touch screen display 200, thereby removing contact between the fingertip and the touch screen display 200. The second condition may comprise a first graphical object presented on the touch screen display 200 being positioned outside a predetermined distance from a second graphical object presented on the touch screen display 200. Additionally, the second condition may comprise receiving from a user an instruction to disable the view window (e.g., a user instruction through an element of the GUI).
At operation 316, the view window module 210 removes (e.g., from visibility) the view window from the touch screen display 200 based on the second condition (e.g., in response to the second condition). Depending on the embodiment, removing the view window from the touch screen display 200 may comprise moving the view window off screen, hiding the visibility of the view window, or the like.
Though the operations of the above method may be depicted and described in a certain order, those skilled in the art will appreciate that the order in which the operations are performed may vary between embodiments, including performing certain operations in parallel. Additionally, those skilled in the art will appreciate that the components described above with respect to the method 300 of the flowchart are merely examples of components that may be used with the method, and other components may also be utilized in some embodiments.
In
As also shown in
In
In
As also shown in
In
In
In some embodiments, a graphical element is provided that functions as a handle for a graphical object (hereafter, a “graphical object handle”). The graphical object handle may appear on a display device, such as a touch screen display, near or adjacent to a graphical object with which it is associated. For instance, the graphical object handle may be positioned offset from the associated graphical object, and may maintain such positioning relative to the associated graphical object when the graphical object handle is used to move the associated graphical object.
For some embodiments, a graphical object handle associated with a graphical object appears once the graphical object has been selected by the user. Once the user has selected the object, the graphical object handle may appear on the display device as a new (yet temporary) graphical element. The graphical object handle may be associated to the graphical object such that the graphical object handle is effectively connected to the associated graphical object. This effective connection may exist even when the graphical object handle appears offset from the associated graphical object and even when no graphical representation of the connection is presented on the display device. By way of the association between a graphical object handle and a graphical object, when a user selects and moves the graphical object handle, the associated graphical object may move as well and may move such that the relative positioning between the graphical object handle and the associated graphical object (as shown on the display device) is maintained. In this way, the movement of a graphical object may be synchronized with a graphical object handle that is associated with the graphical object. For example, moving a graphical object handle 1 inch to the right may cause its associated graphical object to correspondingly move to the right 1 inch. This can apply to all directions of motion and may appear to the user as if there is an invisible connection between the graphical object handle and the graphical object to which it is associated. As noted above, providing such a graphical object handle can assist in the precision movement of graphical objects that may otherwise be too small or too close to other graphical objects to move, particularly by way of a touch screen display using a user's finger. 
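The synchronized movement described above — the associated object moving by the same delta as its handle, in every direction, as if invisibly connected — can be sketched as follows. The class name, the default offset, and the coordinate representation are hypothetical; this is an illustrative model, not the claimed implementation.

```python
class GraphicalObjectHandle:
    """Sketch of a handle that moves its associated graphical object in
    lockstep, preserving the relative offset between handle and object."""

    def __init__(self, object_pos, offset=(30.0, -30.0)):
        self.object_pos = list(object_pos)
        self.offset = offset  # handle appears offset from the object
        self.handle_pos = [object_pos[0] + offset[0],
                           object_pos[1] + offset[1]]

    def drag_to(self, new_handle_pos):
        """Move the handle to a new position; the associated object moves
        by the same delta, so their relative positioning is maintained."""
        dx = new_handle_pos[0] - self.handle_pos[0]
        dy = new_handle_pos[1] - self.handle_pos[1]
        self.handle_pos = list(new_handle_pos)
        self.object_pos[0] += dx
        self.object_pos[1] += dy
```

Because the user drags the handle rather than the (possibly tiny) object itself, the object stays visible beside the fingertip during the move, and the input cannot be confused with a resize gesture on the object.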
Providing such a graphical object handle can also assist in those situations where a user intends for their user input to be interpreted as an action to move a graphical object, rather than it being mistakenly interpreted as an action to resize the graphical object.
In some embodiments, graphical object handles are provided for use with graphical objects on a graphical user interface (GUI) canvas, such as one utilized by a digital whiteboard application operating in association with one or more computing devices. The digital whiteboard application may be one configured to provide collaborative access to a digital whiteboard by two or more computing devices. Users of the GUI canvas may create on the digital whiteboard a small graphical object, such as a small circle representing a dot or a point. When the user selects the small circle on the GUI canvas, the GUI canvas may display a bounding box, resize points, or both, in association with the small circle, that permit a user to resize the small circle. Additionally, when the user selects the small circle on the GUI canvas, the GUI canvas may display a graphical object handle associated with the small circle that permits the user to move the small circle on the GUI canvas with precision. The graphical object handle may be distinct and separate from the graphical elements that facilitate resizing of the small circle. Accordingly, instead of selecting and attempting to move the dot directly, which may be inaccurately interpreted as a resize action (given the small size of the circle), the user can select and move the graphical object handle to accurately position the small circle on the GUI canvas in relation to other graphical objects on the GUI canvas.
The memory system 804 is any memory configured to store data. Some examples of the memory system 804 are storage devices, such as RAM or ROM. The memory system 804 can comprise the RAM cache. In various embodiments, data is stored within the memory system 804. The data within the memory system 804 may be cleared or ultimately transferred to the storage system 806.
The storage system 806 is any storage configured to retrieve and store data. Some examples of the storage system 806 are flash drives, hard drives, optical drives, and/or magnetic tape. In some embodiments, the digital device 800 includes a memory system 804 in the form of RAM and a storage system 806 in the form of flash memory. Both the memory system 804 and the storage system 806 comprise computer readable media which may store instructions or programs that are executable by a computer processor including the processor 802.
The communications network interface (com. network interface) 808 can be coupled to a network (e.g., the computer network 104) via the link 816. The communication network interface 808 may support communication over an Ethernet connection, a serial connection, a parallel connection, or an ATA connection, for example. The communication network interface 808 may also support wireless communication (e.g., 802.11a/b/g/n, WiMax). It will be apparent to those skilled in the art that the communication network interface 808 can support many wired and wireless standards.
The optional input/output (I/O) interface 810 is any device that receives input from the user and outputs data. The optional display interface 812 is any device that is configured to output graphics and data to a display. In one example, the display interface 812 is a graphics adapter.
It will be appreciated by those skilled in the art that the hardware elements of the digital device 800 are not limited to those depicted in
The above-described functions and components can be comprised of instructions that are stored on a storage medium such as a computer readable medium. The instructions can be retrieved and executed by a processor. Some examples of instructions are software, program code, and firmware. Some examples of storage medium are memory devices, tape, disks, integrated circuits, and servers. The instructions are operational when executed by the processor to direct the processor to operate in accord with some embodiments. Those skilled in the art are familiar with instructions, processor(s), and storage medium.
Various embodiments are described herein as examples. It will be apparent to those skilled in the art that various modifications may be made and other embodiments can be used without departing from the broader scope of the invention(s) presented herein. These and other variations upon the exemplary embodiments are intended to be covered by the present invention(s).
The present application claims priority from U.S. Provisional Patent Application Ser. No. 61/836,099, filed Jun. 17, 2013, entitled “Method and Graphical User Interface for Precision Placement and Selection of Objects for Touch-enabled Devices,” which is incorporated herein by reference.
Number | Date | Country
---|---|---
61836099 | Jun 2013 | US