The present invention relates to selecting data entities represented by items in a graphical user interface, and in particular to selecting multiple data entities with gestures on a display device to control their subsequent processing.
Many instances may arise in which a user of a computing device wishes to rapidly view and make choices among a group of items depicted in a graphical user interface that represent various data entities. Data entities may comprise, for example, specific database entries, project tasks, documents, rebates for a promotional campaign, components used for an assembly, and business transactions. Some of the data entities may be selected for further processing.
The user may wish to view available detailed information that is related to the data entities, view which data entities have been previously selected, and/or make new entity selections or deselections. For example, software applications for approving expenses may depict different data entities along a time dimension via displayed graphical tokens. Likewise, a user may wish to efficiently select particular portions of a large set of data entities in an enterprise resource planning application. The subsequent processing may include any tasks that are required as part of business activities.
Displaying an entire dataset may not always be an option. In some instances the dataset is simply too large to allow for convenient display. For example, a display device may be of limited size and resolution, which is often the case with mobile computing devices such as personal digital assistants (PDAs) and smartphones. The user may therefore pan the display to view different portions of the dataset, e.g. in a calendar scenario the user may move along a timeline to display items corresponding to different date ranges. Data entities may be identified by a particular identifying text string or number for compact display.
Unfortunately, past data entity selection tools often required a user to select items only one at a time, which is inconvenient and inefficient. Users want to avoid touching or otherwise selecting many separate entities, as this slows down the selection process. Further, for large datasets with scattered selected entities, it is often difficult for a user to get a complete picture of the selections that have been made. This may prevent the user from easily maintaining context when navigating within a dataset of significant breadth or depth.
As described more fully below, the embodiments disclosed permit improved selection of data entities represented by items in a graphical user interface. Embodiments provide control algorithms to manage a display and user interface within a computer system such as, for example, a mobile computing device. The user interface may display a set of items along with various approval status summary boxes to help a user retain selection context. User gestures and other interactions may control mass selection of represented data entities.
Referring now to
Different projects may be displayed simultaneously, e.g., the “DSG120” 108, “Maxitec R-3300 Professional PC” 110, and “Vaccine DX” 112 exemplary projects are depicted in different vertically separated display regions. Note that although the display regions shown here are each for a different project, it is also possible to display different product groups or categories, business entities, or entire trade promotions in the same manner. A trade promotion or ad campaign may encompass many companies, product lines, and individual products, so the display shown is merely exemplary. Data entities for each project are depicted by graphical user interface items 102, in this case rounded containers. The items as shown each include an identifier string.
The user interface visually informs users of the selection status of each data entity represented. In this application, unselected items 102 are shown unshaded with italic font used for the identifier string, while selected items 114 are shown shaded with bold non-italic font for the identifier string, but this depiction is merely exemplary. Any item visual distinction attribute may be used, including font, font size, font color, item shading, boundary line thickness, item colors, blinking, scrolling text, and other graphic and/or animation effects as may be known in the art. Items may be compressed in display size for user convenience, and the identifying string may be replaced with an abbreviation, e.g. “T- . . . ” or “T-0000 . . . ” as shown in this figure.
In the past, a user selected individual items in the display by designating an item separately and then indicating that its status should be changed from unselected to selected. Users typically designated items by touching them with a cursor controlled by a tactile device like a mouse or trackball, or by moving a cursor over a particular item. With the advent of touchscreen devices, users may instead employ a stylus or finger to designate a particular item. A mouse click or a double-tap or other interaction with a stylus or finger or other tactile device is often used to indicate the user's intent to select the designated item. If a designated item has already been selected, repetition of the selection action may serve to deselect it, or more generally to toggle the selection status. The meaning of repeated selection actions may be set by a user-controllable option.
Individual item selection may however prove inconvenient and time-consuming for users when there is a large number of items. Further, a particular data entity may be represented by more than one instance of a particular item in a graphical user interface, sometimes across several groups on the screen. This may occur, for example, if a particular component is used in the assembly of different products, or if a particular task is performed multiple times in a project. Users should not have to select the same data entity that appears in multiple representative items. Instead, they should be able to select a data entity from one group and have it automatically selected in other groups if they wish.
Embodiments therefore enable a user to select, at one time, multiple items representing potentially multiple data entities. A user may select multiple items in a graphical user interface according to the trajectory of a pointing device, including for example a cursor controlled by a mouse, stylus, or finger. Users may select a single data entity or multiple entities by continuously panning, touching, intersecting, surrounding, or otherwise interacting with the display area where representative items are located. The embodiments detect the positions of each touch, pan, or other interaction, and compare these against the positions of the representative items that are displayed on the screen.
One example trajectory to denote items to be approved is a “check mark” trajectory 116 as shown in this figure. Using a finger for example, a user may touch the display in a blank area and then traverse at least one item in the display in a downward direction, then change the direction of traversal to upward. The check mark 116 interaction formed, shown via a dashed line in this figure (which may actually be depicted in the display as a temporary path marker), may both designate and select all items that are crossed on both the downward and the upward portions of the trajectory. In this instance, items labeled T-00000276 and T-00000278 (items 118 and 120) are crossed both downwardly and upwardly by the user's finger as shown, and are therefore selected. The item labeled T-59999790 (item 122) is not selected, as it was traversed only in the downward direction. The check mark trajectory may be applied to select a single item or to select multiple items in a user-intuitive manner. Users may also set options to define how various trajectories operate, e.g. in this instance either a downward or an upward trajectory interaction may be accepted, instead of both being required. Also, even if a particular defined trajectory is observed, items within its scope may not all necessarily be selected, for example if particular eligibility criteria specified are not met by an item.
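By way of illustration only, the following TypeScript sketch shows one way such a check-mark interpretation might be implemented, assuming items are hit-tested against axis-aligned bounding rectangles and that screen coordinates increase downward; all type and function names are hypothetical and do not correspond to elements of the figures.

    interface Point { x: number; y: number; }
    interface Item { id: string; x: number; y: number; width: number; height: number; }

    // Does the touch point fall within the item's bounding rectangle?
    function contains(item: Item, p: Point): boolean {
      return p.x >= item.x && p.x <= item.x + item.width &&
             p.y >= item.y && p.y <= item.y + item.height;
    }

    // Split the gesture at the point where vertical movement changes from
    // downward (increasing y in screen coordinates) to upward.
    function splitAtTurn(points: Point[]): [Point[], Point[]] {
      for (let i = 1; i < points.length; i++) {
        if (points[i].y < points[i - 1].y) {
          return [points.slice(0, i), points.slice(i - 1)];
        }
      }
      return [points, []]; // no upward stroke was detected
    }

    // Select only the items crossed on both the downward and the upward stroke.
    function checkMarkSelection(points: Point[], items: Item[]): Item[] {
      const [down, up] = splitAtTurn(points);
      const hits = (stroke: Point[]) =>
        new Set(items.filter(it => stroke.some(p => contains(it, p))).map(it => it.id));
      const downHits = hits(down);
      const upHits = hits(up);
      return items.filter(it => downHits.has(it.id) && upHits.has(it.id));
    }

A user option requiring only one traversal direction could be accommodated by returning items present in either set rather than in both.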
After selection is complete, the user may manage the list of selected data entities in order to finalize the approval process. For example, a user may touch or otherwise interactively select status indicators 124 for each item in the approval status summary box 104 to approve them immediately. Such an approval may trigger the highlighting or other visual indication of approval of all the items representing the same data entity, even if they are in different groups or projects. A “delete” button 126 may be provided for each item in the approval status summary box that has not yet been marked for approval, allowing that item to be removed directly via the approval status summary box. Likewise, an “approve” button 128 may be selected by a user to immediately approve all the items listed in the approval status summary box 104. By aggregating selected items into the approval status summary box 104, user context is more easily maintained, particularly when selected items may span different projects or different regions of a calendar not easily seen simultaneously in the display.
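A minimal sketch of this summary box behavior might resemble the following; the class, method, and status names are illustrative assumptions rather than part of the disclosed interface.

    type Status = "to be approved" | "approved";

    interface SummaryEntry { entityId: string; status: Status; }

    class ApprovalSummaryBox {
      private entries = new Map<string, SummaryEntry>();

      add(entityId: string): void {
        if (!this.entries.has(entityId)) {
          this.entries.set(entityId, { entityId, status: "to be approved" });
        }
      }

      // Corresponds to touching an item's status indicator (124 in the figure).
      approve(entityId: string): void {
        const entry = this.entries.get(entityId);
        if (entry) entry.status = "approved";
      }

      // Corresponds to the per-item "delete" button (126); only entries not yet
      // marked for approval may be removed this way.
      remove(entityId: string): void {
        const entry = this.entries.get(entityId);
        if (entry && entry.status !== "approved") this.entries.delete(entityId);
      }

      // Corresponds to the "approve" button (128): approve everything listed.
      approveAll(): void {
        for (const entry of this.entries.values()) entry.status = "approved";
      }
    }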
Referring now to
Referring now to
Referring now to
The boundary may be visually depicted in the display, at least temporarily, to help the user identify the limits of the region defined by the trajectory. The region 406 itself may be shaded, at least temporarily. User options may allow trajectory-based selection to accumulate selected items, or to toggle the selection status of items within the boundary. The display will then alter the depiction of selected items to denote their selection, for example by modifying the font and shading settings to those indicating selected items as described. Data entities corresponding to the selected items are added to a cache list used for subsequent processing of those items. Other region shapes, such as triangular, circular, or square for example, may also be used, and may be assigned particular interaction interpretations. Thus, the embodiments allow users to effectively select multiple entities across several groups using geometric shapes or trajectories, which may follow from gestures used naturally for selections.
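The following sketch illustrates one plausible region-based selection routine, assuming the trajectory is treated as a closed polygon and an item is selected when its center falls inside; the names and the accumulate/toggle option are illustrative assumptions.

    interface Point { x: number; y: number; }
    interface Item { id: string; x: number; y: number; width: number; height: number; }

    // Standard ray-casting test: is point p inside the polygon traced by the gesture?
    function insidePolygon(polygon: Point[], p: Point): boolean {
      let inside = false;
      for (let i = 0, j = polygon.length - 1; i < polygon.length; j = i++) {
        const a = polygon[i];
        const b = polygon[j];
        const crosses =
          (a.y > p.y) !== (b.y > p.y) &&
          p.x < ((b.x - a.x) * (p.y - a.y)) / (b.y - a.y) + a.x;
        if (crosses) inside = !inside;
      }
      return inside;
    }

    // Select items whose centers lie within the enclosed region. Depending on a
    // user option, selection either accumulates or toggles each item's status.
    function applyRegionSelection(
      region: Point[],
      items: Item[],
      selected: Set<string>,
      mode: "accumulate" | "toggle",
    ): void {
      for (const item of items) {
        const center = { x: item.x + item.width / 2, y: item.y + item.height / 2 };
        if (!insidePolygon(region, center)) continue;
        if (mode === "accumulate") selected.add(item.id);
        else if (selected.has(item.id)) selected.delete(item.id);
        else selected.add(item.id);
      }
    }

Eligibility criteria, where specified, could be applied as an additional filter before an item's selection status is changed.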
Referring now to
Each item in the cache table of the approval status summary box may have a corresponding editing link 506 that may be used to trigger editing tools to attach notes to data entities. Such notes may include questions for a second user whose approval may also be required, or provide records of reasons behind an approval decision. Additional checks for funds availability may be incorporated into the approval process through use of the “Check & Approve” button 508. Thus the approval status summary box 502 facilitates quick and easy approval of multiple selected data entities.
Referring now to
Computer system 600 may comprise a mobile computing device such as a personal digital assistant or smartphone for example, along with software products for performing computing tasks. The computer system of
Referring now to
For each point detected during the movement, the method may perform the steps described below. The location(s) where the user has touched or passed with a finger (or stylus point) may be monitored essentially continuously by the processor, and a determination may be made at step 704 as to whether a single location has been designated by such interaction, or if more than one point forms a trajectory. If a single location has been designated by the user in a manner indicating a selection is desired (e.g. a double-click on a mouse or a double-tap with a stylus), execution of the method by the processor proceeds to present the results of the selection, for example by highlighting the selected item in the display.
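One possible form of the decision at step 704 is sketched below; the callback names are placeholders for the single-item selection path and the trajectory path described in the following paragraphs.

    interface Point { x: number; y: number; }

    // Illustrative branching for step 704: a single designated location is handled
    // as an individual selection, while multiple recorded points form a trajectory.
    function handleDesignation(
      points: Point[],
      selectSingle: (p: Point) => void,
      processTrajectory: (pts: Point[]) => void,
    ): void {
      if (points.length === 0) return;                   // nothing designated yet
      if (points.length === 1) selectSingle(points[0]);  // e.g. a double-tap on one item
      else processTrajectory(points);                    // continue with steps 706 onward
    }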
If more than one point has been designated, execution proceeds at step 706, where the processor identifies the coordinates of each point in the trajectory being traced out by the user. The processor then determines at step 708 to which item and represented data entity the touched locations correspond. For example, if a check-mark trajectory is detected, the processor-driven method may identify which items were traversed in both a downward and an upward interaction direction. Other methods of determining data entity correspondence are discussed below.
If the data entities are represented in a space with dates, the processor may identify which row and date correspond to the touched location, and may compare the dates of data entities located in the row against the date of the touched location to determine if the touched location is within the date range of the data entity. If the data entities are not represented in a space with dates, the processor may perform a lookup in a cache table of entity locations to determine to which data entity the touched location belongs. The cache table may be populated when the data entities are initially placed into position.
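The correspondence determination of step 708 might be sketched as follows for both cases, assuming a calendar layout with uniform row heights and day-column widths and a precomputed cache of item bounds; all names and layout parameters are assumptions made for illustration.

    interface Point { x: number; y: number; }

    interface CalendarEntity {
      id: string;
      row: number;
      startDay: number;  // day index of the entity's start date
      endDay: number;    // day index of the entity's end date
    }

    interface CalendarLayout {
      rowHeight: number; // pixels per row
      dayWidth: number;  // pixels per day column
      originDay: number; // day index shown at x = 0 after panning
    }

    // Case 1: entities are laid out along a date dimension.
    function entityAt(
      touch: Point,
      entities: CalendarEntity[],
      layout: CalendarLayout,
    ): CalendarEntity | undefined {
      const row = Math.floor(touch.y / layout.rowHeight);
      const day = layout.originDay + Math.floor(touch.x / layout.dayWidth);
      // Compare the touched date against each entity's date range in that row.
      return entities.find(e => e.row === row && day >= e.startDay && day <= e.endDay);
    }

    // Case 2: no date dimension; consult a precomputed cache of item bounds.
    interface CachedBounds { entityId: string; x: number; y: number; width: number; height: number; }

    function entityFromCache(touch: Point, cache: CachedBounds[]): string | undefined {
      const hit = cache.find(b =>
        touch.x >= b.x && touch.x <= b.x + b.width &&
        touch.y >= b.y && touch.y <= b.y + b.height);
      return hit?.entityId;
    }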
If the data entity has been selected before, then processor execution may return in step 710 to tracking of designated points. Otherwise, the method may proceed in step 712 to determine whether the selected data entity meets other requirements for approval; for example, if the approval status is not “approved”, then the status may be set to “to be approved”. If the requirements are not met, processor execution may return to tracking of designated points.
If the data entity is approved, the processor may add in step 714 the selected data entity to a special cache table of selected data entities. If the data entity appears in multiple groups, there is no need to record the same data entity multiple times. A hash table may be used to implement the special cache table, wherein the key of the cache table is the data entity ID. The value may be any other useful information that is to be preserved.
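A minimal sketch of such a cache, keyed by data entity ID so that duplicates encountered across groups collapse to a single entry, could look like the following; the stored value fields are illustrative.

    interface SelectionInfo { label: string; groups: string[]; }

    // Sketch of the selected-entity cache of step 714, keyed by data entity ID.
    const selectedEntities = new Map<string, SelectionInfo>();

    function cacheSelection(entityId: string, label: string, group: string): void {
      const existing = selectedEntities.get(entityId);
      if (existing) {
        // Same entity reached from another group: keep one entry, note the group.
        if (!existing.groups.includes(group)) existing.groups.push(group);
        return;
      }
      selectedEntities.set(entityId, { label, groups: [group] });
    }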
The processor then proceeds in step 716 to highlight the selected data entity by highlighting the corresponding item in the displayed graphical user interface when the interface is refreshed. The highlighting may include using particular fonts, colors, borders, and other graphical or animation effects as previously described. A user option may allow all item instances corresponding to the selected entity, including those appearing across multiple groups, to be selected and highlighted by the processor, for efficiency. The results of data entity/entities selection may then be presented by the processor in step 718, for example in an approval status summary box.
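Under these assumptions, the refresh-time highlighting of step 716 might be as simple as the following; the CSS class name and element references are placeholders, and entity-level selection ensures that every item instance of a selected entity is styled, regardless of group.

    interface DisplayItem { entityId: string; element: HTMLElement; }

    // On refresh, every displayed item whose underlying data entity is in the
    // selection cache is rendered with the "selected" styling.
    function refreshHighlighting(items: DisplayItem[], selected: Set<string>): void {
      for (const item of items) {
        item.element.classList.toggle("selected-item", selected.has(item.entityId));
      }
    }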
Details of the operations summarized in the flowchart are now described further. When the method determines data entity correspondence from an interaction trajectory, it may check whether items are surrounded by geometric regions of known shapes, such as rectangles, circles, ovals, triangles, etc. Items located only partially within regions may or may not be selected, according to predetermined user preferences. The method identifies the region boundaries and calculates the area occupied by the region. If the area is larger than the screen area (indicating panning), the method may determine what data entities are not included in the selected area; these data entities are not selected and thus should be excluded from the list of data entities displayed on the screen.
If the area of a region is less than or equal to the screen area, the method may use the region boundaries to determine the rows and columns delimiting the selected area. In a calendar format, columns typically correspond to the dates of the data entities. If beginning and ending dates of a range are determined, then data entities with dates located between those beginning and ending dates may be selected.
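For a region no larger than the screen, the conversion from region boundaries to a selected date range might be sketched as follows; the rectangular region and the layout parameters are assumptions made for illustration.

    interface Rect { left: number; top: number; right: number; bottom: number; }
    interface Entity { id: string; row: number; startDay: number; endDay: number; }

    // Convert the region's rows and columns into a row span and a date range,
    // then select entities whose date ranges overlap that interval.
    function selectByRegion(
      region: Rect,
      entities: Entity[],
      rowHeight: number,
      dayWidth: number,
      originDay: number,
    ): Entity[] {
      const firstRow = Math.floor(region.top / rowHeight);
      const lastRow = Math.floor(region.bottom / rowHeight);
      const firstDay = originDay + Math.floor(region.left / dayWidth);
      const lastDay = originDay + Math.floor(region.right / dayWidth);
      return entities.filter(e =>
        e.row >= firstRow && e.row <= lastRow &&
        e.startDay <= lastDay && e.endDay >= firstDay);
    }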
If no items are selected for approval, the method may instruct the processor to display a corresponding message to the user. Otherwise, the corresponding data entities may be depicted by the processor in an approval status summary box for easy user management. The user may remove a selected data entity from the list provided by the approval status summary box using the corresponding controls, and the removed data entity will accordingly not be highlighted in the graphical user interface. Similarly, the user may approve all selected entities with one click by using the corresponding controls. The approval status summary box may be dragged and scrolled within the display by the user, in order to view other data entities within the graphical user interface.
The method may further direct the processor to perform additional checks to verify whether a data entity meets particular business criteria, such as having sufficient available funds, before it is approved. An indicator in the approval status summary box may denote the outcome of such additional checks. Further, the method may direct the processor to allow users to view more details about a data entity by navigating to its details, using for example an editing button to display and potentially modify all summary data.
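A hedged sketch of such a “check and approve” step, using funds availability as the example criterion and placeholder callbacks for the backend check, the approval action, and the summary box indicator, is shown below.

    interface Approvable { entityId: string; requestedAmount: number; }

    // Gate approval on a business-criteria check; the funds lookup is a placeholder
    // for whatever backend call an implementation would actually use.
    async function checkAndApprove(
      entity: Approvable,
      availableFunds: (id: string) => Promise<number>,
      approve: (id: string) => void,
      flag: (id: string, reason: string) => void,
    ): Promise<void> {
      const funds = await availableFunds(entity.entityId);
      if (funds >= entity.requestedAmount) {
        approve(entity.entityId);
      } else {
        // The summary box indicator may denote that this additional check failed.
        flag(entity.entityId, "insufficient funds");
      }
    }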
As used herein, the terms “a” or “an” shall mean one or more than one. The term “plurality” shall mean two or more than two. The term “another” is defined as a second or more. The terms “including” and/or “having” are open ended (e.g., comprising). Reference throughout this document to “one embodiment”, “certain embodiments”, “an embodiment” or similar term means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. Thus, the appearances of such phrases in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments without limitation. The term “or” as used herein is to be interpreted as inclusive or meaning any one or any combination. Therefore, “A, B or C” means “any of the following: A; B; C; A and B; A and C; B and C; A, B and C”. An exception to this definition will occur only when a combination of elements, functions, steps or acts is in some way inherently mutually exclusive.
In accordance with the practices of persons skilled in the art of computer programming, embodiments are described below with reference to operations that are performed by a computer system or a like electronic system. Such operations are sometimes referred to as being computer-executed. It will be appreciated that operations that are symbolically represented include the manipulation by a processor, such as a central processing unit, of electrical signals representing data bits and the maintenance of data bits at memory locations, such as in system memory, as well as other processing of signals. The memory locations where data bits are maintained are physical locations that have particular electrical, magnetic, optical, or organic properties corresponding to the data bits.
When implemented in software, the elements of the embodiments are essentially the code segments to perform the necessary tasks. The non-transitory code segments may be stored in a processor readable medium or computer readable medium, which may include any medium that may store or transfer information. Examples of such media include an electronic circuit, a semiconductor memory device, a read-only memory (ROM), a flash memory or other non-volatile memory, a floppy diskette, a CD-ROM, an optical disk, a hard disk, a fiber optic medium, etc. User input may include any combination of a keyboard, mouse, touch screen, voice command input, etc. User input may similarly be used to direct a browser application executing on a user's computing device to one or more network resources, such as web pages, from which computing resources may be accessed.
While particular embodiments of the present invention have been described, it is to be understood that various different modifications within the scope and spirit of the invention are possible. The invention is limited only by the scope of the appended claims.