Digital processing systems and methods for display navigation mini maps

Information

  • Patent Grant
  • Patent Number
    12,105,948
  • Date Filed
    Wednesday, December 28, 2022
  • Date Issued
    Tuesday, October 1, 2024
  • Examiners
    • Bashore; William L
    • Mercado; Gabriel
  • Agents
    • Finnegan, Henderson, Farabow, Garrett & Dunner LLP
Abstract
Systems, methods, and computer-readable media for presenting groups of information on a display are disclosed. Systems and methods include presenting the groups in the form of a page, each group of information having an associated size, wherein a cumulative size of all groups of information is larger than a dimension of the display; receiving an initial scrolling signal for causing the presented page to scroll; and augmenting the display with a scroll bar divided into sections of differing visual effects. Each section may have a visual effect that is assigned to one group of the plurality of groups of information. A length of each section may be proportional to the associated size of the one group relative to the cumulative size of all the groups, and an order of the visual effects in the scroll bar may correspond to an order of the groups of information in the page.
Description
TECHNICAL FIELD

The present disclosure relates generally to information display methods and display navigation. More specifically, this disclosure relates to systems and methods for performing display navigation operations. Consistent with the disclosed embodiments, non-transitory computer-readable storage media may store program instructions, which may be executed by at least one processing device to perform any of the steps and/or methods described herein.


BACKGROUND

Operation of modern enterprises can be complicated and time consuming. In many cases, managing the operation of a single project requires integration of several employees, departments, and other resources of the entity. To manage the challenging operation, project management software applications may be used. Such software applications allow a user to organize, plan, and manage resources by providing project-related information in order to optimize the time and resources spent on each project. It would be useful to improve these software applications to increase operation management efficiency.


In addition, project management software applications often depend on the use of massive amounts of shared information, taking the form of documents, files, ledgers, spreadsheets, dashboards, or more generally a page. Most of the time, not all the content on a page can fit within the dimensions of a particular display device. This is especially true when a plurality of users is allowed to add information or when using a mobile display device with particularly limited dimensions. It is therefore essential to be able to quickly find one's way through this vast amount of information to ensure efficient operations. In these situations, many display devices rely on graphical user interface components, such as scroll bars, to allow a user to access different portions of the information.


One limitation of existing scroll bars is that they do not reflect the organization of the content of a page. A scroll bar typically includes an elongated track representing the overall size of the page and a cursor anchored on or near the track at the relative position of the portion of the page being displayed. When scrolling occurs, by dragging the cursor, for example, there is no way of knowing which section of the page is being displayed or of associating a particular position on the scroll bar with a particular group of information.


SUMMARY

Embodiments consistent with the present disclosure provide systems and methods for performing and facilitating navigation operations. The disclosed embodiments may be implemented using a combination of conventional hardware and software as well as specialized hardware and software.


In an embodiment, a non-transitory computer-readable medium containing instructions that, when executed, cause at least one processor to perform display navigation operations is disclosed. The operations may comprise presenting a plurality of groups of information on a display, in the form of a page, each of the plurality of groups of information having an associated size, wherein a cumulative size of all of the groups of information is larger than at least one dimension of the display; receiving an initial scrolling signal for causing the presented page to scroll on the display; and augmenting the display with a scroll bar divided into sections of differing visual effects, wherein each section has a visual effect corresponding to a visual effect assigned to one group of the plurality of groups of information, a length of each section is proportional to the associated size of the one group relative to the cumulative size of all the groups, and an order of the visual effects in the scroll bar corresponds to an order of the groups of information in the page.


In an embodiment, a method for display navigation is disclosed. The method may comprise: presenting a plurality of groups of information on a display, in the form of a page, each of the plurality of groups of information having an associated size, wherein a cumulative size of all of the groups of information is larger than at least one dimension of the display; receiving an initial scrolling signal for causing the presented page to scroll on the display; and augmenting the display with a scroll bar divided into sections of differing visual effects, wherein each section has a visual effect corresponding to a visual effect assigned to one group of the plurality of groups of information, a length of each section is proportional to the associated size of the one group relative to the cumulative size of all the groups, and an order of the visual effects in the scroll bar corresponds to an order of the groups of information in the page.


In an embodiment, a system for performing display navigation operations on a display having dimensions smaller than a page presented on the display is disclosed. The system may comprise a memory storing instructions and at least one processor that executes the stored instructions to: present a plurality of groups of information on a display, in the form of a page, each of the plurality of groups of information having an associated size, wherein a cumulative size of all of the groups of information is larger than at least one dimension of the display; receive an initial scrolling signal for causing the presented page to scroll on the display; and augment the display with a scroll bar divided into sections of differing colors, wherein each section is colored in a color assigned to one group of the plurality of groups of information, a length of each section is proportional to the associated size of the one group relative to the cumulative size of all the groups, and an order of the colors in the scroll bar corresponds to an order of the groups of information in the presented page.
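The proportional-layout behavior described in the summary can be sketched in code. The following Python fragment is an illustrative sketch only, not the patented implementation; the group names, sizes, colors, and bar length are hypothetical assumptions.

```python
# Illustrative sketch of dividing a scroll bar into colored sections whose
# lengths are proportional to each group's share of the cumulative size.
# Group names, sizes, and colors below are assumed example data.

def layout_scroll_bar(groups, bar_length):
    """Divide a scroll bar of `bar_length` units into ordered sections,
    each proportional to its group's size relative to the total."""
    total = sum(size for _, size, _ in groups)
    sections = []
    offset = 0.0
    for name, size, color in groups:
        length = bar_length * size / total  # proportional to relative size
        sections.append({"group": name, "color": color,
                         "start": offset, "length": length})
        offset += length
    return sections

groups = [("Tasks", 600, "red"), ("Done", 300, "green"), ("Backlog", 100, "blue")]
bar = layout_scroll_bar(groups, bar_length=200)
# Section lengths 120.0, 60.0, 20.0, in the same order as the groups on the page.
```

Each section's length is the bar length scaled by the group's share of the cumulative size, and the order of sections matches the order of the groups in the page, as recited above.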


Other advantages of the invention are set forth in the appended claims, which form an integral part hereof. The foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the claims.





BRIEF DESCRIPTION OF THE DRAWINGS

The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate various disclosed embodiments. In the drawings:



FIG. 1 is a flowchart of an exemplary process for performing navigation operations, consistent with the disclosed embodiments.



FIG. 2A is an illustration of an exemplary graphical user interface with a scroll bar divided into sections of differing visual effects, consistent with disclosed embodiments.



FIG. 2B is another illustration of an exemplary graphical user interface with a scroll bar divided into sections of differing visual effects, consistent with disclosed embodiments.



FIG. 2C is another illustration of an exemplary graphical user interface with a scroll bar divided into sections of differing visual effects, consistent with disclosed embodiments.



FIG. 2D is another illustration of an exemplary graphical user interface with a scroll bar divided into sections of differing visual effects, consistent with disclosed embodiments.



FIG. 3 is an illustration of an exemplary graphical user interface with multiple groups of information that are initially assigned a same color, consistent with disclosed embodiments.



FIG. 4 is an illustration of an exemplary graphical user interface with a group of information that is not initially assigned a visual effect, consistent with disclosed embodiments.



FIG. 5 is an illustration of an exemplary graphical user interface with elements from two different groups of information combined to form a new group of information, consistent with disclosed embodiments.



FIG. 6A is an exemplary illustration of a plurality of groups of information presented on a display, showing a representation of a scroll bar divided into sections of differing visual effects, and a variable visual effect cursor, consistent with the disclosed embodiments.



FIG. 6B is an illustration of an exemplary graphical user interface with a scroll bar divided into sections of differing visual effects and a variable visual effect cursor, consistent with disclosed embodiments.



FIG. 6C is another illustration of an exemplary graphical user interface with a scroll bar divided into sections of differing visual effects and a variable visual effect cursor, consistent with disclosed embodiments.



FIG. 7A is an illustration of an exemplary graphical user interface with a pop-up window, consistent with disclosed embodiments.



FIG. 7B is another illustration of an exemplary graphical user interface with a pop-up window, consistent with disclosed embodiments.



FIG. 8 is a diagram of a transition in an exemplary graphical user interface that triggers a haptic signal, consistent with disclosed embodiments.



FIG. 9A is an illustration of an exemplary graphical user interface with a scroll bar section that is smaller than a predetermined threshold, consistent with disclosed embodiments.



FIG. 9B is another illustration of an exemplary graphical user interface with a scroll bar section that is smaller than a predetermined threshold, consistent with disclosed embodiments.



FIG. 9C is another illustration of an exemplary graphical user interface with a scroll bar section that is smaller than a predetermined threshold, consistent with disclosed embodiments.



FIG. 10A is an illustration of an exemplary graphical user interface with a scroll bar section having a modified visual appearance, consistent with disclosed embodiments.



FIG. 10B is another illustration of an exemplary graphical user interface with a scroll bar section having a modified visual appearance, consistent with disclosed embodiments.



FIG. 10C is another illustration of an exemplary graphical user interface with a scroll bar section having a modified visual appearance, consistent with disclosed embodiments.



FIG. 11 is an illustration of an exemplary graphical user interface with a plurality of groups of information presented on a screen, a scroll bar divided into sections of different visual effects, and a cursor, consistent with disclosed embodiments.



FIG. 12 is a block diagram of an exemplary computing device which may be employed in connection with embodiments of the present disclosure.



FIG. 13 is a block diagram of an exemplary computing architecture for collaborative work systems, consistent with embodiments of the present disclosure.





DETAILED DESCRIPTION

Disclosed embodiments provide improved display navigation mechanisms. Disclosed embodiments may generate and display a scroll bar depicting a “mini map” of a displayed page that is larger than the size of the computer display. In response to an interaction with the displayed mini map, a user may be able to quickly scroll to a different section of the board or page.


Disclosed embodiments may be suitable for graphical user interface boards and tablature structures because they may enable users to interact with the scroll bar to quickly jump to different sections of the tablature without needing to scroll all the way through, while also providing an overview of each section of the tablature, with indications showing the relative sizes of the groups of information. Such exemplary embodiments may be helpful in different display devices such as those found on mobile devices, computers, or any other 2D, 3D, AR, VR, or holographic displays. The indications of the groupings of information may be displayed vertically, horizontally, or in any other orientation in the scroll bar, according to user preference or according to a determined structure of the tablature. The indications of each section may be based on any characteristic of the information in the table, such as a shared status, person, or data type (e.g., grouping all text columns together, all email columns together, and so on). For example, a scroll of a scroll bar mini map may result in the display of indications that may be relatively sized and colored according to statuses and the number of items sharing a particular status. In response to any interaction, such as a scroll, some disclosed embodiments may include displaying a mini map of all group types found in the tablature.
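As a hedged illustration of sizing indications by a shared status, the sketch below counts how many items share each status and derives each status's relative share; the row data and field names are assumptions for illustration, not part of the disclosure.

```python
# Assumed data model: each table row carries a "status" field. The relative
# share per status could size the corresponding mini-map indication.
from collections import Counter

def status_indications(rows):
    """Return each status's relative share of the items (sums to 1.0)."""
    counts = Counter(row["status"] for row in rows)
    total = sum(counts.values())
    return {status: n / total for status, n in counts.items()}

rows = [{"status": "Done"}, {"status": "Done"},
        {"status": "Stuck"}, {"status": "Working on it"}]
shares = status_indications(rows)
# "Done" accounts for half the items, the other two statuses for a quarter each.
```

An indication for "Done" would thus be twice the size of the indications for the other statuses, consistent with relative sizing by the number of items sharing a status.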


Exemplary embodiments are described with reference to the accompanying drawings. The figures are not necessarily drawn to scale. While examples and features of disclosed principles are described herein, modifications, adaptations, and other implementations are possible without departing from the spirit and scope of the disclosed embodiments. Also, the words “comprising,” “having,” “containing,” and “including,” and other similar forms are intended to be equivalent in meaning and be open ended in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items. It should also be noted that as used herein and in the appended claims, the singular forms “a,” “an,” and “the” include plural references unless the context clearly dictates otherwise.


In the following description, various working examples are provided for illustrative purposes. However, it is to be understood that the present disclosure may be practiced without one or more of these specific details.


Throughout, this disclosure mentions “disclosed embodiments,” which refer to examples of inventive ideas, concepts, and/or manifestations described herein. Many related and unrelated embodiments are described throughout this disclosure. The fact that some “disclosed embodiments” are described as exhibiting a feature or characteristic does not mean that other disclosed embodiments necessarily share that feature or characteristic.


This disclosure presents various mechanisms for collaborative work systems. Such systems may involve software that enables multiple users to work collaboratively. By way of one example, workflow management software may enable various members of a team to cooperate via a common online platform. It is intended that one or more aspects of any mechanism may be combined with one or more aspects of any other mechanism, and such combinations are within the scope of this disclosure.


This disclosure is constructed to provide a basic understanding of a few exemplary embodiments with the understanding that features of the exemplary embodiments may be combined with other disclosed features or may be incorporated into platforms or embodiments not described herein while still remaining within the scope of this disclosure. For convenience, any form of the word “embodiment” as used herein is intended to refer to a single embodiment or multiple embodiments of the disclosure.


Certain embodiments disclosed herein include devices, systems, and methods for collaborative work systems that may allow a user to interact with information in real time. To avoid repetition, the functionality of some embodiments is described herein solely in connection with a processor or at least one processor. It is to be understood that such exemplary descriptions of functionality apply equally to methods and computer readable media and constitute a written description of systems, methods, and computer readable media. The underlying platform may allow a user to structure systems, methods, or computer readable media in many ways using common building blocks, thereby permitting flexibility in constructing a product that suits desired needs. This may be accomplished through the use of boards. A board may be a table configured to contain items (e.g., individual items presented in horizontal rows) defining objects or entities that are managed in the platform (task, project, client, deal, etc.). Unless expressly noted otherwise, the terms “board” and “table” may be considered synonymous for purposes of this disclosure. In some embodiments, a board may contain information beyond that which is displayed in a table. Boards may include sub-boards that may have a separate structure from a board. Sub-boards may be tables with sub-items that may be related to the items of a board. Columns intersecting with rows of items may together define cells in which data associated with each item may be maintained. Each column may have a heading or label defining an associated data type. When used herein in combination with a column, a row may be presented horizontally and a column vertically. However, in the broader generic sense as used herein, the term “row” may refer to one or more of a horizontal and/or a vertical presentation.
A table or tablature, as used herein, refers to data presented in horizontal and vertical rows (e.g., horizontal rows and vertical columns) defining cells in which data is presented. Tablature may refer to any structure for presenting data in an organized manner, as previously discussed, such as cells presented in horizontal rows and vertical columns, vertical rows and horizontal columns, a tree data structure, a web chart, or any other structured representation, as explained throughout this disclosure. A cell may refer to a unit of information contained in the tablature defined by the structure of the tablature. For example, a cell may be defined as an intersection between a horizontal row and a vertical column in a tablature having rows and columns. A cell may also be defined as an intersection between a horizontal and a vertical row, or as an intersection between a horizontal and a vertical column. As a further example, a cell may be defined as a node on a web chart or a node on a tree data structure. As would be appreciated by a skilled artisan, however, the disclosed embodiments are not limited to any specific structure, but rather may be practiced in conjunction with any desired organizational arrangement. In addition, tablature may include any type of information, depending on intended use. When used in conjunction with a workflow management application, the tablature may include any information associated with one or more tasks, such as one or more status values, projects, countries, persons, teams, progress statuses, a combination thereof, or any other information related to a task.
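As one possible concrete reading of these definitions, a board can be modeled as a mapping from items to column values, with a cell addressed by the intersection of an item row and a column heading. This is a minimal sketch under assumed names, not a prescribed data model.

```python
# Minimal assumed representation of tablature: each item (row) maps column
# headings to cell values. Item and column names below are hypothetical.
board = {
    "Task 1": {"Status": "Done", "Person": "Alice"},
    "Task 2": {"Status": "Working on it", "Person": "Bob"},
}

def cell(board, item, column):
    """Return the cell at the intersection of an item row and a column."""
    return board[item][column]

value = cell(board, "Task 1", "Status")  # the "Status" cell of "Task 1"
```

Other arrangements described above (a tree data structure, a web chart) would address cells by node rather than by row/column intersection, but the cell remains the unit of information defined by the structure.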


While a table view may be one way to present and manage the data contained on a board, a table's or board's data may be presented in different ways. For example, in some embodiments, dashboards may be utilized to present or summarize data derived from one or more boards. A dashboard may be a non-table form of presenting data, using, for example, static or dynamic graphical representations. A dashboard may also include multiple non-table forms of presenting data. As discussed later in greater detail, such representations may include various forms of graphs or graphics. In some instances, dashboards (which may also be referred to more generically as “widgets”) may include tablature. Software links may interconnect one or more boards with one or more dashboards thereby enabling the dashboards to reflect data presented on the boards. This may allow, for example, data from multiple boards to be displayed and/or managed from a common location. These widgets may provide visualizations that allow a user to update data derived from one or more boards.


Boards (or the data associated with boards) may be stored in a local memory on a user device or may be stored in a local network repository. Boards may also be stored in a remote repository and may be accessed through a network. In some instances, permissions may be set to limit board access to the board's “owner” while in other embodiments a user's board may be accessed by other users through any of the networks described in this disclosure. When one user makes a change in a board, that change may be updated to the board stored in a memory or repository and may be pushed to the other user devices that access that same board. These changes may be made to cells, items, columns, boards, dashboard views, logical rules, or any other data associated with the boards. Similarly, when cells are tied together or are mirrored across multiple boards, a change in one board may cause a cascading change in the tied or mirrored boards or dashboards of the same or other owners.
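The cascading behavior of tied or mirrored cells can be sketched as follows; the `MirroredCell` class and its methods are hypothetical illustrations of the propagation described above, not an implementation taken from the disclosure.

```python
# Assumed illustration: a change to one cell cascades to the cells that
# mirror it, including across boards of the same or other owners.
class MirroredCell:
    def __init__(self):
        self.value = None
        self.mirrors = []  # cells tied to this one

    def tie(self, other):
        """Tie another cell so it mirrors this cell's value."""
        self.mirrors.append(other)

    def set(self, value):
        """Set the value and cascade it to all tied cells."""
        self.value = value
        for m in self.mirrors:
            if m.value != value:  # guard against infinite back-and-forth
                m.set(value)

a = MirroredCell()
b = MirroredCell()
a.tie(b)
b.tie(a)          # mirrored in both directions
a.set("Done")     # the change cascades to the mirrored cell
```

The value check before re-setting prevents an infinite cascade when cells are tied in both directions, which matches the described behavior of a change in one board propagating to tied or mirrored boards.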


Boards and widgets may be part of a platform that may enable users to interact with information in real time in collaborative work systems involving electronic collaborative word processing documents. Electronic collaborative word processing documents (and other variations of the term) as used herein are not limited to only digital files for word processing, but may include any other processing document such as presentation slides, tables, databases, graphics, sound files, video files or any other digital document or file. Electronic collaborative word processing documents may include any digital file that may provide for input, editing, formatting, display, and/or output of text, graphics, widgets, objects, tables, links, animations, dynamically updated elements, or any other data object that may be used in conjunction with the digital file. Any information stored on or displayed from an electronic collaborative word processing document may be organized into blocks. A block may include any organizational unit of information in a digital file, such as a single text character, word, sentence, paragraph, page, graphic, or any combination thereof. Blocks may include static or dynamic information, and may be linked to other sources of data for dynamic updates. Blocks may be automatically organized by the system, or may be manually selected by a user according to preference. In one embodiment, a user may select a segment of any information in an electronic word processing document and assign it as a particular block for input, editing, formatting, or any other further configuration.


An electronic collaborative word processing document may be stored in one or more repositories connected to a network accessible by one or more users through their computing devices. In one embodiment, one or more users may simultaneously edit an electronic collaborative word processing document. The one or more users may access the electronic collaborative word processing document through one or more user devices connected to a network. User access to an electronic collaborative word processing document may be managed through permission settings set by an author of the electronic collaborative word processing document. An electronic collaborative word processing document may include graphical user interface elements enabled to support the input, display, and management of multiple edits made by multiple users operating simultaneously within the same document.


Various embodiments are described herein with reference to a system, method, device, or computer readable medium. It is intended that the disclosure of one is a disclosure of all. For example, it is to be understood that disclosure of a computer readable medium described herein also constitutes a disclosure of methods implemented by the computer readable medium, and systems and devices for implementing those methods, via for example, at least one processor. It is to be understood that this form of disclosure is for ease of discussion only, and one or more aspects of one embodiment herein may be combined with one or more aspects of other embodiments herein, within the intended scope of this disclosure.


Embodiments described herein may refer to a non-transitory computer readable medium containing instructions that when executed by at least one processor, cause the at least one processor to perform a method. Non-transitory computer readable mediums may be any medium capable of storing data in any memory in a way that may be read by any computing device with a processor to carry out methods or any other instructions stored in the memory. The non-transitory computer readable medium may be implemented as hardware, firmware, software, or any combination thereof. Moreover, the software may preferably be implemented as an application program tangibly embodied on a program storage unit or computer readable medium consisting of parts, or of certain devices and/or a combination of devices. The application program may be uploaded to, and executed by, a machine comprising any suitable architecture. Preferably, the machine may be implemented on a computer platform having hardware such as one or more central processing units (“CPUs”), a memory, and input/output interfaces. The computer platform may also include an operating system and microinstruction code. The various processes and functions described in this disclosure may be either part of the microinstruction code or part of the application program, or any combination thereof, which may be executed by a CPU, whether or not such a computer or processor is explicitly shown. In addition, various other peripheral units may be connected to the computer platform such as an additional data storage unit and a printing unit. Furthermore, a non-transitory computer readable medium may be any computer readable medium except for a transitory propagating signal.


As used herein, a non-transitory computer-readable storage medium refers to any type of physical memory on which information or data readable by at least one processor can be stored. Examples of memory include Random Access Memory (RAM), Read-Only Memory (ROM), volatile memory, nonvolatile memory, hard drives, CD ROMs, DVDs, flash drives, disks, any other optical data storage medium, any physical medium with patterns of holes, markers, or other readable elements, a PROM, an EPROM, a FLASH-EPROM or any other flash memory, NVRAM, a cache, a register, any other memory chip or cartridge, and networked versions of the same. The terms “memory” and “computer-readable storage medium” may refer to multiple structures, such as a plurality of memories or computer-readable storage mediums located within an input unit or at a remote location. Additionally, one or more computer-readable storage mediums can be utilized in implementing a computer-implemented method. The memory may include one or more separate storage devices collocated or dispersed, capable of storing data structures, instructions, or any other data. The memory may further include a memory portion containing instructions for the processor to execute. The memory may also be used as a working scratch pad for the processors or as temporary storage. Accordingly, the term computer-readable storage medium should be understood to include tangible items and exclude carrier waves and transient signals.


Some embodiments may involve at least one processor. Consistent with disclosed embodiments, “at least one processor” may constitute any physical device or group of devices having electric circuitry that performs a logic operation on an input or inputs. For example, the at least one processor may include one or more integrated circuits (ICs), including application-specific integrated circuits (ASICs), microchips, microcontrollers, microprocessors, all or part of a central processing unit (CPU), graphics processing unit (GPU), digital signal processor (DSP), field-programmable gate array (FPGA), server, virtual server, or other circuits suitable for executing instructions or performing logic operations. The instructions executed by at least one processor may, for example, be pre-loaded into a memory integrated with or embedded into the controller or may be stored in a separate memory. The memory may include a Random Access Memory (RAM), a Read-Only Memory (ROM), a hard disk, an optical disk, a magnetic medium, a flash memory, other permanent, fixed, or volatile memory, or any other mechanism capable of storing instructions. In some embodiments, the at least one processor may include more than one processor. Each processor may have a similar construction or the processors may be of differing constructions that are electrically connected or disconnected from each other. For example, the processors may be separate circuits or integrated in a single circuit. When more than one processor is used, the processors may be configured to operate independently or collaboratively, and may be co-located or located remotely from each other. The processors may be coupled electrically, magnetically, optically, acoustically, mechanically or by other means that permit them to interact.


Consistent with the present disclosure, disclosed embodiments may involve a network. A network may constitute any type of physical or wireless computer networking arrangement used to exchange data. For example, a network may be the Internet, a private data network, a virtual private network using a public network, a Wi-Fi network, a LAN or WAN network, a combination of one or more of the foregoing, and/or other suitable connections that may enable information exchange among various components of the system. In some embodiments, a network may include one or more physical links used to exchange data, such as Ethernet, coaxial cables, twisted pair cables, fiber optics, or any other suitable physical medium for exchanging data. A network may also include a public switched telephone network (“PSTN”) and/or a wireless cellular network. A network may be a secured network or unsecured network. In other embodiments, one or more components of the system may communicate directly through a dedicated communication network. Direct communications may use any suitable technologies, including, for example, BLUETOOTH™, BLUETOOTH LE™ (BLE), Wi-Fi, near field communications (NFC), or other suitable communication methods that provide a medium for exchanging data and/or information between separate entities.


Certain embodiments disclosed herein may also include a computing device for generating features for work collaborative systems. The computing device may include processing circuitry communicatively connected to a network interface and to a memory, wherein the memory contains instructions that, when executed by the processing circuitry, configure the computing device to receive, from a user device associated with a user account, an instruction to generate a new column of a single data type for a first data structure, wherein the first data structure may be a column-oriented data structure, and store, based on the instruction, the new column within the column-oriented data structure repository, wherein the column-oriented data structure repository may be accessible to, and may be displayed as a display feature for, the user account and at least a second user account. The computing devices may be devices such as mobile devices, desktops, laptops, tablets, or any other devices capable of processing data. Such computing devices may include a display such as an LED display, an augmented reality (AR) display, or a virtual reality (VR) display.


Disclosed embodiments may include and/or access a data structure. A data structure consistent with the present disclosure may include any collection of data values and relationships among them. The data may be stored linearly, horizontally, hierarchically, relationally, non-relationally, uni-dimensionally, multidimensionally, operationally, in an ordered manner, in an unordered manner, in an object-oriented manner, in a centralized manner, in a decentralized manner, in a distributed manner, in a custom manner, or in any manner enabling data access. By way of non-limiting examples, data structures may include an array, an associative array, a linked list, a binary tree, a balanced tree, a heap, a stack, a queue, a set, a hash table, a record, a tagged union, an ER model, and a graph. For example, a data structure may include an XML database, an RDBMS database, an SQL database or NoSQL alternatives for data storage/search such as, for example, MongoDB, Redis, Couchbase, Datastax Enterprise Graph, Elastic Search, Splunk, Solr, Cassandra, Amazon DynamoDB, Scylla, HBase, and Neo4J. A data structure may be a component of the disclosed system or a remote computing component (e.g., a cloud-based data structure). Data in the data structure may be stored in contiguous or non-contiguous memory. Moreover, a data structure, as used herein, does not require information to be co-located. It may be distributed across multiple servers, for example, that may be owned or operated by the same or different entities. Thus, the term “data structure” as used herein in the singular is inclusive of plural data structures.


Certain embodiments disclosed herein may include a processor configured to perform methods that may include triggering an action in response to an input. The input may be from a user action or from a change of information contained in a user's table or board, in another table, across multiple tables, across multiple user devices, or from third-party applications. Triggering may be caused manually, such as through a user action, or may be caused automatically, such as through a logical rule, logical combination rule, or logical templates associated with a board. For example, a trigger may include an input of a data item that is recognized by at least one processor that brings about another action.


In some embodiments, the methods including triggering may cause an alteration of data and may also cause an alteration of display of data contained in a board or in memory. An alteration of data may include a recalculation of data, the addition of data, the subtraction of data, or a rearrangement of information. Further, triggering may also cause a communication to be sent to a user, other individuals, or groups of individuals. The communication may be a notification within the system or may be a notification outside of the system through a contact address such as by email, phone call, text message, video conferencing, or any other third-party communication application.


Some embodiments include one or more of automations, logical rules, logical sentence structures and logical (sentence structure) templates. While these terms are described herein in differing contexts, in a broadest sense, in each instance an automation may include a process that responds to a trigger or condition to produce an outcome; a logical rule may underlie the automation in order to implement the automation via a set of instructions; a logical sentence structure is one way for a user to define an automation; and a logical template/logical sentence structure template may be a fill-in-the-blank tool used to construct a logical sentence structure. While all automations may have an underlying logical rule, all automations need not implement that rule through a logical sentence structure. Any other manner of defining a process that responds to a trigger or condition to produce an outcome may be used to construct an automation.


Other terms used throughout this disclosure in differing exemplary contexts may generally share the following common definitions.


In some embodiments, machine learning algorithms (also referred to as machine learning models or artificial intelligence in the present disclosure) may be trained using training examples, for example in the cases described below. Some non-limiting examples of such machine learning algorithms may include classification algorithms, data regression algorithms, image segmentation algorithms, visual detection algorithms (such as object detectors, face detectors, person detectors, motion detectors, edge detectors, etc.), visual recognition algorithms (such as face recognition, person recognition, object recognition, etc.), speech recognition algorithms, mathematical embedding algorithms, natural language processing algorithms, support vector machines, random forests, nearest neighbors algorithms, deep learning algorithms, artificial neural network algorithms, convolutional neural network algorithms, recursive neural network algorithms, linear machine learning models, non-linear machine learning models, ensemble algorithms, and so forth. For example, a trained machine learning algorithm may comprise an inference model, such as a predictive model, a classification model, a regression model, a clustering model, a segmentation model, an artificial neural network (such as a deep neural network, a convolutional neural network, a recursive neural network, etc.), a random forest, a support vector machine, and so forth. In some examples, the training examples may include example inputs together with the desired outputs corresponding to the example inputs. Further, in some examples, training machine learning algorithms using the training examples may generate a trained machine learning algorithm, and the trained machine learning algorithm may be used to estimate outputs for inputs not included in the training examples. In some examples, engineers, scientists, processes and machines that train machine learning algorithms may further use validation examples and/or test examples.
For example, validation examples and/or test examples may include example inputs together with the desired outputs corresponding to the example inputs. A trained machine learning algorithm and/or an intermediately trained machine learning algorithm may be used to estimate outputs for the example inputs of the validation examples and/or test examples, the estimated outputs may be compared to the corresponding desired outputs, and the trained machine learning algorithm and/or the intermediately trained machine learning algorithm may be evaluated based on a result of the comparison. In some examples, a machine learning algorithm may have parameters and hyper-parameters, where the hyper-parameters are set manually by a person or automatically by a process external to the machine learning algorithm (such as a hyper-parameter search algorithm), and the parameters of the machine learning algorithm are set by the machine learning algorithm according to the training examples. In some implementations, the hyper-parameters are set according to the training examples and the validation examples, and the parameters are set according to the training examples and the selected hyper-parameters.
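As an illustrative sketch only (not part of the disclosure), the interplay described above between training examples, validation examples, parameters, and hyper-parameters can be shown with a toy learner: the learner sets its own parameter from the training examples, while an external search selects the learning rate (a hyper-parameter) by evaluating intermediately trained models on the validation examples. All names and values below are assumptions for illustration.

```python
def train(examples, learning_rate):
    """Toy learner: fit a single weight w to (x, y) pairs by gradient
    descent on squared error. The parameter w is set from the training
    examples; the learning rate is a hyper-parameter set externally."""
    w = 0.0
    for _ in range(200):
        for x, y in examples:
            w -= learning_rate * 2 * (w * x - y) * x
    return w

def evaluate(w, examples):
    """Mean squared error of estimated outputs versus desired outputs."""
    return sum((w * x - y) ** 2 for x, y in examples) / len(examples)

# Example inputs together with the desired outputs (here, y = 3x).
train_examples = [(x, 3.0 * x) for x in (1.0, 2.0, 3.0)]
valid_examples = [(x, 3.0 * x) for x in (1.5, 2.5)]

# External hyper-parameter search: each candidate learning rate yields
# an intermediately trained model, which is evaluated on the
# validation examples; the best-scoring model is kept.
best = min(
    (train(train_examples, rate) for rate in (0.001, 0.01, 0.05)),
    key=lambda model: evaluate(model, valid_examples),
)
```

The validation examples never influence the parameter update itself; they only steer the choice among candidate hyper-parameters, which is the separation the paragraph above describes.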



FIG. 1 is a schematic diagram of an exemplary process 100 for performing navigation operations that may be executed by at least one processor. Process 100 is used for explanatory purposes and is not intended to be limiting. The process may be implemented using one or more components of computing device 1200 (discussed in FIG. 12) or user device 1320 of computing architecture 1300 (discussed in FIG. 13). As shown in FIG. 1, process 100 may include a step 102 of presenting a plurality of groups of information on a display in the form of a page. In some embodiments, the page may be a complete full-screen image, or a representation of a screenful of information. The page may include text and elements that are provided on a display, subject to the dimensions of the display such as the display width and/or height. In some embodiments, the page may be a screen of a computer application executed by at least one processor of a computing device such as computing device 1200. In some embodiments, the page may be a webpage or website provided for presentation on a display of the computing device 1200.


Above and throughout this disclosure, a “group of information” may refer to any type of data associated with a visual representation, such as text, images, numbers, lists, tables, diagrams, charts, graphics, drawings, or other types of graphical user interface components. Each group of information may have an associated size, such as a size that a group of information may have when presented on a display. In some embodiments, a “size” may be associated with a dimension of length, such as standard or metric units of length, or a number of pixels shown on the display. Multiple groups of information, considered together, may have a cumulative size corresponding to the total size of all the groups of information. A cumulative size of the plurality of groups of information may be larger than at least one dimension of the display.


As used in this disclosure, the term “display” may refer either to any physical device capable of providing a visual presentation of data or directly to a visual presentation of data. Examples of physical devices acting as displays include computer screens, smartphone screens, tablet screens, smartwatch screens, laptop screens, video walls, projectors, head-mounted displays, or virtual reality headsets. Additionally, displays may utilize graphical user interfaces (GUIs) to permit user interaction with data. In many GUIs, a visual presentation of data is often provided using a graphical user interface component known as a window, or a page. Any visual presentation on a device or display may be characterized by dimensions. Because these dimensions are usually limited, information often cannot be completely presented by a display device or fit within a presented page. For example, when a plurality of groups of information is arranged side by side, a cumulative size of the plurality of groups of information may be larger than at least one dimension of the display.


In step 104, the processor may receive an initial scrolling signal for causing the presented page to scroll on the display. In some embodiments, the initial scrolling signal may be received as a result of manipulating various controls associated with the scroll bar, or a particular movement made by the user that is interpreted by the processor as a command to scroll. In some embodiments, a scroll signal can be the result of moving a cursor docked on or near a scroll bar presented on the display, clicking on increment/decrement control interface buttons, detecting a touch motion or gesture associated with manipulating or attempting to move the presented page, or performing a swipe motion relative to the display. In the context of this description, an initial scrolling signal refers to a scrolling signal that may occur in advance of scrolling the page, and in advance of a later scrolling signal. In some embodiments, an initial scrolling signal may scroll the presented page over a distance less than, greater than, or equal to one of the dimensions of the display.


In some embodiments, the presented page may represent less than an entire page. In some embodiments, the entire page may include all the groups from the plurality of groups of information. Accordingly, the entire page may include one or more groups of information that are not displayed on the presented page. In some embodiments, the presented page may include at least a portion of all of the groups of information, but may not include a portion of a large group that extends beyond the presented page. In such embodiments, the entire page may include all of the groups of information in their entirety. Thus, the entire page may include the presented page and, in addition, one or more groups of information or portions of groups of information that are not fully displayed on the presented page. For example, the entire page may extend beyond the dimensions of the display, such that the scroll bar facilitates navigation to different parts of the presented page and also to portions of the entire page that are not yet presented. Such portions of the entire page may be presented during a scrolling action toward the portions. In some embodiments, an interaction with a particular location on the scroll bar may scroll the page to a corresponding particular location in the entire page. An interaction may refer to any type of user input related to a scrollbar component. For example, if a user clicks on a particular location of the scroll bar, the page may be scrolled to the corresponding particular location. Other examples of interactions with a particular location on the scroll bar may include persistently clicking on the particular location, touching the scroll bar at a particular location presented on a touchscreen, persistently touching the scroll bar at a particular location, performing a gesture such as double tapping a particular location on the scroll bar, or repeatedly tapping a particular location on the scroll bar.
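One way to realize the interaction just described, scrolling the page to the location in the entire page that corresponds to a clicked point on the scroll bar, is a linear mapping from track position to page offset. This is a hypothetical sketch; the function name, pixel units, and clamping policy are assumptions, not part of the disclosure.

```python
def page_offset_for_click(click_y, bar_length, page_length, view_length):
    """Map a click at click_y pixels along the scroll bar track to the
    page offset (in pixels) that should become the top of the visible
    viewport, assuming a linear correspondence between the track and
    the entire page."""
    if bar_length <= 0:
        raise ValueError("scroll bar track must have positive length")
    # Fraction of the way along the track, clamped to [0, 1].
    fraction = min(max(click_y / bar_length, 0.0), 1.0)
    # Clamp so the viewport never scrolls past the end of the page.
    max_offset = max(page_length - view_length, 0)
    return round(fraction * max_offset)
```

For example, with a 300-pixel track, a 2400-pixel entire page, and an 800-pixel viewport, a click halfway down the track scrolls the viewport to offset 800, placing the middle portion of the entire page on the display.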


In step 106, in response to receiving the initial scrolling signal, the display may be augmented with a scroll bar divided into sections of differing visual effects. The visual effects may serve to distinguish and differentiate between each group of information among the plurality of groups of information, thereby facilitating user navigation operations and increasing the efficiency and accuracy of navigation operations. In some embodiments, each portion of the scroll bar may be directly associated with each group of information using different visual effects.


As discussed herein, visual effects may refer to any type of enhancement or characteristic of a visual representation that distinguishes one group of information, or one section of the scroll bar, from another. In one embodiment, the differing visual effects may include a unique color associated with each group of the plurality of groups of information. For example, a first group of information can be associated with a section having a first color, a second group with a second color, and so on, such that the displayed colors all differ from each other. Different colors may differ by shade, hue, tone, brightness, coloration, or other characteristics that cause one color to differ visually from another. In some embodiments, the differing visual effects may include a unique combination of a color and a texture associated with each group of the plurality of groups of information. A texture may include a pattern or other visual appearance that may be combined with a color to further differentiate appearances of different sections in a scroll bar. Non-limiting examples of textures can include stippling patterns of various densities, or cross-hatching patterns. In such embodiments, a first group of information can be associated with a first combination of color and texture, a second group with a second combination of color and texture, and so on, so that the displayed combinations are all different from each other, even if multiple sections have the same or similar colors. In this situation, two or more groups of information can be associated with the same color but with a different texture, and conversely, two or more groups of information can be associated with the same texture but with different colors. Accordingly, each section of the scroll bar may have a visual effect corresponding to a visual effect assigned to one group of the plurality of groups of information, as shown in FIG. 1 in substep 110.
In some embodiments, the visual effects may include a combination of two or more colors, gradients, patterns, shadows or any combination thereof.
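The unique color-and-texture combinations described above can be sketched as follows. This is an assumed helper, not from the disclosure: colors are cycled first, and a new texture is introduced only once the colors are exhausted, so every group's combination is unique even when individual colors or textures repeat.

```python
def assign_unique_effects(groups, colors, textures):
    """Pair each group with a unique (color, texture) combination.
    Colors vary fastest, so a texture repeats a color only after all
    colors have been used with the previous texture."""
    combos = [(color, texture) for texture in textures for color in colors]
    if len(groups) > len(combos):
        raise ValueError("not enough distinct color/texture combinations")
    return dict(zip(groups, combos))
```

With two colors and two textures, up to four groups can be distinguished; a third group reuses the first color but with a different texture, matching the scenario described above in which sections share a color yet remain visually distinct.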


In some embodiments, a “scroll bar” may be a graphical user interface component or element that provides a visual representation of the groups of information in the entire page. In some embodiments, a scroll bar is an interactive horizontal or vertical bar at the side or bottom of the display, for moving around a page on the display. The scroll bar may include part or all of a mini map of all of the groups of information, using different visual effects to readily distinguish between each group of information. Graphical user interface components such as the scroll bar may include interactive capabilities, and permit a user to access a particular portion of the page via selection of a section in the scroll bar associated with a group of information corresponding to the particular portion of the page. The scroll bar may therefore permit a user to scroll and navigate in the page in an indicated direction based on characteristics of the received scrolling signal.


In some embodiments, sections of the scroll bar may have attributes determined based on the groups of information in the entire page. For example, in a displayed scroll bar, a length of each section may be set proportional to an associated size of the one group relative to the cumulative size of all the groups, as shown in substep 108 of FIG. 1. Accordingly, each section may be displayed in a manner that conveys relative sizes of each of the plurality of groups of information. Some embodiments may include determining a total length or size of a page in a given dimension, and determining a ratio, portion, percentage, or fraction of the size occupied by each one of the groups of information, in order to generate a displayed scroll bar with such proportional section lengths.
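The proportional-length computation of substep 108 can be sketched as one possible implementation (the function name and integer-pixel rounding are assumptions): each section's length is the bar length scaled by that group's fraction of the cumulative size, with the rounding remainder absorbed by the last section so the sections exactly tile the bar.

```python
def section_lengths(group_sizes, bar_length):
    """Lengths (in pixels) of scroll bar sections, each proportional to
    one group's size relative to the cumulative size of all groups."""
    total = sum(group_sizes)
    if total <= 0 or bar_length <= 0:
        raise ValueError("group sizes and bar length must be positive")
    lengths = [bar_length * size // total for size in group_sizes[:-1]]
    # Give the rounding remainder to the last section so the section
    # lengths sum exactly to the bar length.
    lengths.append(bar_length - sum(lengths))
    return lengths
```

For instance, groups of sizes 200, 400, and 200 pixels shown against a 400-pixel track yield sections of 100, 200, and 100 pixels, conveying the relative sizes of the groups as described above.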


In some embodiments, an order of the visual effects in the scroll bar may be set to correspond to an order of the groups of information in the page, as shown in substep 112. As a result, a user may scroll between groups of information simply by moving up or down along the scroll bar, and may quickly and efficiently navigate between groups of information in the page. In some embodiments, all of the sections may be presented on the display while the scroll bar is presented. In other embodiments, fewer than all of the sections may be presented on the display. The displayed sections may change depending on various factors such as a size of the page, a current position in the page, and one or more size constraints or capabilities of the display.


In some embodiments, the scroll bar may be configured to disappear from the display after a predetermined time, and after the initial scrolling signal is completed. For example, once the initial scrolling signal is received and completed, the scroll bar may disappear from the display after 1, 2, or 5 seconds, or any other suitable time.



FIGS. 2A-2D illustrate examples of a plurality of groups of information presented on a display, showing a representation of a scroll bar divided into sections of differing visual effects consistent with the disclosed embodiments. A display 208 of a computing device is shown as a broken-line box to illustrate the dimensions and bounds of the display. As shown, a scroll bar 210 is located on the right side of the display, with sections (212a-d, 214a-d, 216a-d) of differing visual effects. A plurality of groups of information (202a-d, 204a-d, 206a-d) are included in a page presented on the display.


As shown in FIGS. 2A-2D, the dimensions of display 208 do not permit the display of the entire page 218 having groups of information (202a-d, 204a-d, 206a-d). As shown, a cumulative size of the plurality of groups of information (Group 1, Group 2, and Group 3) is greater than one of the dimensions of the display (e.g., the vertical dimension), and thus not all the groups of information are displayed simultaneously. In the examples in FIGS. 2A-2D, the first group of information (202a-d) is fully displayed, the second group of information (204a-d) is partially displayed and cropped, and the third group of information is not presented on the display due to display size limitations.


In the examples shown, scroll bar 210 is divided into a plurality of sections (212a-d, 214a-d, 216a-d), and each section has a visual effect corresponding to the visual effect assigned to the group of the plurality of groups of information. The sections of scroll bar 210 are shown with lengths that are proportional to the associated sizes of each of the groups of information. For example, as shown in FIG. 2B, section 212b corresponds to “Group 1” and group of information 202b, and section 214b corresponds to “Group 2” and group of information 204b. Group 2 has a larger size relative to Group 1 in the page, and therefore section 214b has a proportionally larger size relative to section 212b.



FIGS. 2A-2D also show a scroll bar 210 having sections in an order corresponding to the order of the groups of information in the page. For example, as shown in FIG. 2A, sections 212a and 214a correspond to Group 1 and Group 2, respectively. Visual effects assigned to Group 1 and Group 2 are shown in scroll bar 210 in the order in which those groups of information appear in the page. In some embodiments, visual effects appearing in scroll bar 210 may not appear in the respective groups of information. In such embodiments, visual effects may be assigned to different groups of information to distinguish groups in the scroll bar, even if the same visual effects do not appear elsewhere in the page.


Although FIGS. 2A-2D illustrate three different groups of information, with different associated sizes, it is to be appreciated that any number of groups of information may be used. The benefits of the disclosed embodiments are realized when the number of groups of information is greater than one. In addition, two different groups of information may have the same associated size, or may have different sizes. It is also to be appreciated that while FIGS. 2A-2D depict a plurality of groups of information whose cumulative size is greater than the vertical dimension of the display, the cumulative size may be greater than either or both of the horizontal and vertical dimensions of the display. In some embodiments, the cumulative size may be greater than both dimensions of the display. In such embodiments, a standard scroll bar may be provided to perform navigation operations along a direction other than the direction of scroll bar 210 with sections of different visual effects. For example, if the presented page includes a plurality of groups that are larger in height and width than the display, scroll bar 210 may be displayed along a vertical axis, and a standard scroll bar may be displayed along a horizontal axis. In some embodiments, scroll bar 210 having multiple sections and visual effects may be displayed along a horizontal axis. Further, while FIGS. 2A-2D illustrate the scroll bar 210 located on the right-hand side of the display, scroll bar 210 may be located in any suitable portion of the display. The potential configurations are not limited by the examples in this disclosure.


As discussed above, a length of each section may be proportional to an associated size of the one group relative to the cumulative size of all the groups. Furthermore, one of the two dimensions (vertical or horizontal) of the scroll bar 210 may be proportional to the overall size of the page corresponding to the cumulative size of all the groups of information. In the context of this description, this dimension will be referred to as the “primary dimension” and the other dimension as the “secondary dimension.” In some embodiments, the primary dimension may be comparable to one of the dimensions of the display. For example, in FIGS. 2A-2D, the vertical dimension of scroll bar 210 may correspond to the vertical dimension of display 208. The secondary dimension may not be proportional to characteristics of the plurality of groups of information. In some embodiments, the secondary dimension may be selected to meet readability, visual comfort, or aesthetic criteria. In some other embodiments, the secondary dimension may be a result of a user input.


Although FIGS. 2A-2D represent examples of a new scroll bar design having sections with different visual effects, it should be appreciated that any other design implementation that covers the functions of a scroll bar and provides a visual representation with parts divided into sections of different visual effects can be used. For example, scroll bar 210 may appear similar to standard scroll bars, having a scroll bar “shaft” or “track,” in which a scroll “thumb” or scroll “box” moves. The scroll bar track may be divided into sections of different visual effects, such that the track has an outline, background, or any other visual feature that shows different visual effects consistent with the disclosed embodiments. In some embodiments, the scroll box may have a fixed appearance, or may include one or more varying visual effects consistent with disclosed embodiments. In some embodiments, scroll bar 210 may include one or more navigation buttons at one or more ends of the scroll bar.


Disclosed embodiments may provide different ways of identifying a visual effect assigned to a group of information. In some embodiments, a visual effect may be displayed in conjunction with a group of information such that when scrolling through the section corresponding to the group of information, the visual effect assigned to the group of information is always visible. For example, as shown in FIG. 2A, a visual-effect-coded border may be displayed next to the group of information, such that the differing visual effects may include a unique color associated with each group of the plurality of groups of information. As another example, as shown in FIG. 2D, the differing visual effects may correspond to a unique combination of a color and a texture associated with each group of the plurality of groups of information. In some embodiments, as shown in FIG. 2B, a visual-effect-coded heading may be displayed at the top of the display or above a corresponding group of information. The heading may be displayed until the corresponding group of information is no longer presented due to a scroll signal associated with navigating to one or more other groups of information that occupy the presented page. In some embodiments, such as the example illustrated in FIG. 2C, a visual-effect-coded background may be presented behind a corresponding group of information, such that the visual effects correspond to a unique color associated with each group of the plurality of groups of information. The examples shown in FIGS. 2A-2D are non-limiting, and other visual designs that include a visual effect displayed in conjunction with a group of information may be employed without departing from the scope of the disclosed embodiments.


In some embodiments, the differing visual effects may include a unique combination of a color and a texture associated with each group of the plurality of groups of information. In some embodiments, in response to a determination that a same color is assigned to more than one of the plurality of groups of information, a different texture may be assigned to each group of the plurality of groups of information having the same color. In some embodiments, textures may be assigned to only one or more of the groups of information that are assigned the same color. In some embodiments, textures may be assigned to all groups of information associated with the same color. Such a situation may arise, for example, if a user manually assigns the same color to multiple groups, or as another example, if the number of groups of information exceeds the number of available, distinguishable colors.
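The color-collision fallback described above can be sketched like this. The helper is hypothetical, and since the disclosure leaves the exact policy open, this variant assigns textures only to groups whose color is duplicated, leaving uniquely colored groups untextured.

```python
from collections import Counter

def resolve_color_collisions(colors, textures):
    """Return one (color, texture) pair per group, given each group's
    assigned color. Groups with a unique color keep a plain (None)
    texture; each group sharing a duplicated color receives a distinct
    texture drawn in order from the textures list."""
    counts = Counter(colors)
    next_texture = {}  # per-color index of the next unused texture
    effects = []
    for color in colors:
        if counts[color] > 1:
            i = next_texture.get(color, 0)
            effects.append((color, textures[i]))
            next_texture[color] = i + 1
        else:
            effects.append((color, None))
    return effects
```

Applied to the FIG. 3 scenario, where the first and third groups share a color, the two colliding groups receive different textures while the middle group is unchanged.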



FIG. 3 is an exemplary illustration of a plurality of groups of information, in which two groups of information are initially assigned the same color, consistent with disclosed embodiments. As shown in FIG. 3, a first group of information 302, second group of information 304, and third group of information 306 are included in a page. First group of information 302 and third group of information 306 may be assigned the same color. The device processor may determine that more than one group of information is assigned the same color, and responsive to this determination a different texture may be assigned to each group of information that is assigned the same color. In some embodiments, this determination may be made prior to an initial scrolling signal, or upon receipt of the initial scrolling signal. Although FIG. 3 illustrates a situation in which two groups of information have been assigned the same color, the process of assigning a different texture to each group of information may be generalized to any number of groups of information that have been assigned the same color. Additionally or alternatively, in some embodiments, in response to a determination that a same color is assigned to more than one of the plurality of groups of information, one or more of the colors assigned to each group of the plurality of groups of information may be changed, so that all assigned colors are different from each other.


Disclosed embodiments may provide different ways of assigning a visual effect to a group of information. In some embodiments, a visual effect may be assigned to a group of information in response to a user input. For example, at any time during a process of creating or editing a group of information, the processor may receive inputs from a user associated with choosing a visual effect and assigning it to the group of information. In some embodiments, the processor may receive an input from a user associated with modifying a visual effect assigned to the group of information at any time during a process of editing a group of information. In some embodiments, one or more processors may automatically assign a visual effect to a group of information based on information included in the group of information, such as based on a type of the information. For example, a first visual effect may be assigned to a first group of information corresponding to images, and a different second visual effect may be assigned to a second group of information corresponding to text. In some embodiments, a visual effect may be automatically assigned to a group of information when the group of information is created. Further, in some embodiments, each of the plurality of groups of information may have a predetermined visual effect. In some embodiments, all groups of information may be assigned visual effects, regardless of how the visual effect is assigned to each group of information. The assignment may occur, in some embodiments, prior to receiving the initial scrolling signal.


In some embodiments, at least one processor may be configured to assign a random visual effect to at least one of the plurality of groups of information, in response to a determination that one of the plurality of groups of information was not assigned a visual effect. In some embodiments, such a determination may be made prior to the scroll, or prior to receiving the initial scrolling signal. A “random” assignment may be made in a pseudorandom manner by employing a computerized randomizer or a random number generator that selects a visual effect in a manner where no visual effect is more likely to be selected over other visual effects at a given time. FIG. 4 is an illustration of a plurality of groups of information having one group of information that is not initially assigned a visual effect, in accordance with the disclosed embodiments. As shown in FIG. 4, there are first, second, and third groups of information 402, 404, and 406, and third group of information 406 is not assigned a visual effect. At least one processor may determine, prior to an initial scrolling signal, upon receipt of the initial scrolling signal, or prior to the scroll, that third group 406 is not assigned a visual effect. Responsive to this determination, the at least one processor may assign a random visual effect to third group 406. In the example shown, a combination of color and texture is assigned to third group 406. In some embodiments, such an assignment may result in a visual effect being applied to a section of a scroll bar. Although FIG. 4 illustrates a situation in which only one group of information has not been assigned a visual effect, the process of assigning a random visual effect to one group of information may be generalized to any number of groups of information that have not been assigned a visual effect.


In some embodiments, assigning a random visual effect to one of the plurality of groups of information may include assigning a random visual effect that is different from all visual effects already assigned to the other groups of information of the plurality of groups of information. For example, if the visual effects correspond to a unique color associated with each group of the plurality of groups of information, a random color may be selected from all available colors on the display, minus the colors already assigned to the other groups of information. Additionally or alternatively, in some embodiments, assigning a random visual effect to one of the plurality of groups of information may include assigning a random visual effect that contrasts with other visual effects by at least a threshold difference in a visual characteristic from the visual effects assigned to the other groups of information. Such characteristics may include, for example, a threshold difference in hue, brightness, coloration, tint, tone, darkness, contrast, or any other measurable characteristic associated with visual impression and identity. For instance, in the example shown in FIG. 4, a color that is chromatically opposite to the color of group 404 may be selected for group 406 to increase the contrast between two consecutive groups of information. In some embodiments, wherein the visual effects may correspond to a unique color associated with each group of the plurality of groups of information, assigning a random visual effect to one of the plurality of groups of information may include assigning a color belonging to a predetermined group of colors, or a color palette, different from the colors already assigned to the other groups of information. In some embodiments, a user may provide, and at least one processor may receive, input associated with assigning unique colors from a color palette to one or more groups of information for practical or aesthetic reasons.
If the at least one processor determines that no color has been assigned to certain other groups of information, a color from the remaining colors in the palette may be randomly selected and assigned to those other groups of information.
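As a non-limiting illustration, selecting a random color from the remaining palette colors may be sketched as follows. The palette contents and data layout are illustrative assumptions.

```python
import random

# Hypothetical color palette; any predetermined group of colors could be used.
PALETTE = ["red", "green", "blue", "yellow", "purple", "orange"]

def assign_random_colors(groups, palette=PALETTE, rng=random):
    """Assign each group lacking a color a random color not already in use,
    so every assigned color remains unique among the groups."""
    used = {g["color"] for g in groups if g.get("color")}
    for g in groups:
        if not g.get("color"):
            remaining = [c for c in palette if c not in used]
            choice = rng.choice(remaining)
            g["color"] = choice
            used.add(choice)
    return groups

groups = assign_random_colors([{"color": "red"}, {"color": "green"}, {"color": None}])
```

Because the selection pool excludes colors already in use, the result satisfies the uniqueness property described above without requiring retries.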


In some embodiments, in response to a determination that elements from different groups of information are combined to form a new group of information, at least one processor may assign to the new group of information a visual effect that corresponds to a combination of visual effects of the different groups of information from which the elements originate. As discussed herein, a new group of information may refer to a group of information that did not exist prior to an original assignment of visual effects, or existed as multiple separate groups. The new group of information may be considered part of the plurality of groups of information, and an associated size of the new group of information may contribute to the cumulative size of all groups of information. In some embodiments, the cumulative size may remain constant, if the combined group does not differ in dimension from the sum of the groups of information that were combined. FIG. 5 is an illustration of an exemplary plurality of groups of information, including first group 502, second group 504, and third group 506. As shown, elements from two different groups of information (first group 502 and second group 504) are combined to form a new group of information, in accordance with disclosed embodiments. As shown in FIG. 5, a new group “1-bis” 508 is created from elements originally belonging to the first group 502 and second group 504. A processor may assign a visual effect to the new group of information 508 based on a combination of the visual effects assigned to the first 502 and second 504 groups of information, such as by combining one or more of the color(s) and texture(s) from first group 502 and from second group 504. In some embodiments, in response to the determination that a new group is created, a new section corresponding to the new group may be added to the scroll bar (not shown in FIG. 5) at a position corresponding to the position of the new group and with a dimension (such as length) that is proportional to the associated size of the new group relative to the other groups of information.
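As a non-limiting illustration of the proportionality relationship, each section's length may be computed as its group's share of the cumulative size of all groups. The function and values below are illustrative assumptions.

```python
def section_lengths(group_sizes, bar_length):
    """Compute scroll bar section lengths, each proportional to its group's
    associated size relative to the cumulative size of all groups."""
    total = sum(group_sizes)
    return [bar_length * size / total for size in group_sizes]

# Combining two groups of sizes 300 and 200 into one group of size 500
# leaves the cumulative size (1000) and the remaining section unchanged.
before = section_lengths([300, 200, 500], bar_length=100)
after = section_lengths([500, 500], bar_length=100)
```

This also illustrates the constant-cumulative-size case: the merged group's section simply spans the bar length previously occupied by its constituent groups.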


In some embodiments, at least one processor may receive a continued scrolling signal following the initial scrolling signal, and in response to the continued scrolling signal, may cause the page to scroll on the display while maintaining a static position of the scroll bar on the display. In the context of this description, a continuous scrolling signal may refer to a scrolling signal received after an initial scrolling signal is completed. In some embodiments, the continuous scrolling signal may comprise a version of the initial scrolling signal that persists beyond a predetermined threshold period of time. In some embodiments, the continuous scrolling signal may cause a page currently presented on the display to scroll based on a direction of the received signal, so that the page presented on the display during the scroll differs from the page initially presented prior to the scroll. In some embodiments, a continuous scrolling signal may scroll the presented page over a distance less than, greater than, or equal to one of the dimensions of the display. Note that during the course of operations of disclosed processes, one or more continuous scrolling signals may be received; therefore, a continuous scrolling signal may correspond to a scrolling signal received after a previous continuous scrolling signal is completed.


In some embodiments, the scroll bar may maintain a fixed and constant position on the display during a scroll, to maintain a static position. That is, a location of the scroll bar may remain unchanged and invariable. In some embodiments, the scroll bar may disappear from the display. The at least one processor may terminate display of the scroll bar in various scenarios, such as after a predetermined time after an initial scrolling signal is completed or no longer received, or after a predetermined time following the last receipt of a continuous scroll signal. In some embodiments, this predetermined time may be as small as zero seconds, such that the scroll bar disappears immediately after completion of the initial and/or continuous scrolling signals. In some embodiments, the scroll bar may remain on the display if a continuous scrolling signal is received after the initial scrolling signal is completed and before a predetermined time period elapses. For example, once the initial scrolling signal is received and completed, the scroll bar may remain on the display if a continuous scrolling signal is received before elapse of a time period of 1, 2, 5 seconds or any suitable time. In some other embodiments, the scroll bar may be configured to disappear from the display after a predetermined time just after a continuous scrolling signal is completed. For example, once the continuous scrolling signal is received and completed, the scroll bar may disappear from the display after 1, 2, 5 seconds or any suitable time period. Additionally, in some embodiments in which the time between the end of one continuous scrolling signal and the receipt of another continuous scrolling signal is less than a predetermined time, the scroll bar may be configured to remain on the display. For example, if the delay between the completion of a first continuous scrolling signal and the reception of a second continuous scrolling signal is less than 1, 2, 5 seconds or any suitable time, the scroll bar may be configured to remain on the display.
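As a non-limiting illustration, the hide-after-timeout behavior may be sketched as a small state holder. The class, method names, and the 2-second timeout are illustrative assumptions.

```python
class ScrollBarState:
    """Illustrative sketch of scroll bar visibility: the bar remains on the
    display only while the delay since the last completed scrolling signal
    (initial or continuous) is below a predetermined timeout."""

    def __init__(self, timeout=2.0):
        self.timeout = timeout
        self.last_signal_end = None  # time the last scrolling signal completed

    def on_signal_complete(self, t):
        # Called when an initial or continuous scrolling signal ends.
        self.last_signal_end = t

    def visible(self, now):
        # Visible only within `timeout` seconds of the last completed signal.
        if self.last_signal_end is None:
            return False
        return (now - self.last_signal_end) < self.timeout

bar = ScrollBarState(timeout=2.0)
bar.on_signal_complete(t=10.0)
```

A continuous scrolling signal arriving before the timeout elapses simply refreshes `last_signal_end`, keeping the bar on the display, which matches the behavior described above.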


In some embodiments, in response to the detection of the initial scrolling signal or the continuous scrolling signal, at least one processor may augment the display with a variable visual effect cursor located near the scroll bar. Furthermore, during scrolling within a particular group of information, the at least one processor may display a visual effect of the variable visual effect cursor that corresponds to the visual effect on the scroll bar associated with the particular group of information. A cursor may refer to an on-display indicator, icon, or other graphical element used to mark a position on a display. Additionally, a cursor may correspond to and indicate a position on a display that will be affected by a user input. Thus, the indicated position may be a place at which a user last interacted with the display, or a place on the display where at least one processor would effect a change in response to a received input.



FIGS. 6A-6C illustrate examples of a plurality of groups of information presented on a display 608, consistent with disclosed embodiments. As shown, a scroll bar 610 is divided into sections of differing visual effects, and a variable visual effect cursor (618a, 618b, 618c) is presented along the scroll bar. In these figures, the entire page constituted by the plurality of groups of information (602, 604, 606) remains the same. The groups of information in the presented page on the display change in a manner corresponding to changes in the appearance and position of the variable visual effect cursor (618a, 618b, 618c) along the scroll bar 610.


In FIG. 6A, the presented page includes the first group of information 602, and accordingly, the variable visual effect cursor 618a is located along the scroll bar 610 adjacent first section 612 of scroll bar 610. An appearance of visual effect cursor 618a has a visual effect that is assigned to the first group 602. In some embodiments, the first group of information 602 may or may not include the assigned visual effect on the display.


In FIGS. 6B and 6C, a scroll may cause the presented page to transition to the second group of information 604 (FIG. 6B) and then to the third group 606 of information (FIG. 6C). As shown, the variable visual effect cursor (618b, 618c) may transition along the scroll bar 610 accordingly. In FIG. 6B, variable visual effect cursor 618b is positioned adjacent a respective location of the second section 614 of scroll bar 610. In FIG. 6C, variable visual effect cursor 618c is positioned adjacent a respective location of third section 616. As shown, variable visual effect cursor (618b, 618c) may have a visual effect corresponding to the second group 604 and third group 606, shown in FIGS. 6B and 6C, respectively.


As shown in FIGS. 6A-6C, variable visual effect cursor (618a, 618b, 618c) may have a shape such as a 5-sided icon, but the shape of the cursor is not limited to the illustrated examples. Although FIGS. 6A-6C depict a variable visual effect cursor (618a, 618b, 618c) adopting a pentagon shape, it is to be understood that any other suitable shape such as a square, rectangle, disk, triangle, arrow, or any other shape or appearance that conveys a position on scroll bar 610 may be used. In some embodiments, a user input associated with “dragging” the variable visual effect cursor (618a, 618b, 618c) to a particular location along the scroll bar 610 may scroll the page to a corresponding particular location in the entire page. Additionally, in some embodiments, dragging the variable visual effect cursor (618a, 618b, 618c) may scroll the page with a scrolling speed higher than a scrolling speed associated with an initial scrolling signal or a continuous scrolling signal. In some embodiments, the scrolling speed may correspond to a ratio between a size of the entire page and a size of the scroll bar or size of the display. In some embodiments, the scrolling speed may be predetermined and fixed. In some embodiments, the scrolling speed may be determined dynamically based on user input.
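As a non-limiting illustration of the ratio-based mapping, a cursor position along the scroll bar may be converted to a page offset as follows. The function and parameter names are illustrative assumptions.

```python
def page_offset_from_cursor(cursor_pos, bar_length, page_length, view_length):
    """Map a cursor position along the scroll bar to a page offset. Dragging
    the cursor traverses the page at a speed scaled by the ratio between the
    scrollable page range and the scroll bar length."""
    scrollable = page_length - view_length  # range the page can actually scroll
    return scrollable * cursor_pos / bar_length

# Dragging the cursor halfway along a 100-unit bar scrolls halfway through
# a 5000-unit page viewed through a 1000-unit display.
offset = page_offset_from_cursor(cursor_pos=50, bar_length=100,
                                 page_length=5000, view_length=1000)
```

Because the page is much larger than the bar, a small cursor movement produces a large page movement, which is why dragging is faster than ordinary scrolling signals.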


In some embodiments, the variable visual effect cursor (618a, 618b, 618c) may be configured to disappear from the display 608 after a predetermined time period. In some embodiments, scroll bar 610 may be configured to disappear from display 608 after a predetermined time just after an initial or a continuous scrolling signal is completed, and variable visual effect cursor (618a, 618b, 618c) may be configured to disappear from the display 608 after a time less than or equal to the predetermined time. For example, if the scroll bar is configured to disappear from the display after 2 seconds following last receipt of a scrolling signal, the variable visual effect cursor may disappear after 1 second, 1.5 seconds, or at 2 seconds to disappear simultaneously with the scroll bar.


In some embodiments, in response to at least one of the initial scrolling signal or the continued scrolling signal, at least one processor may be configured to cause a pop-up window to appear displaying a name of one of the plurality of groups of information associated with a current position of scrolling. In some embodiments, a visual effect of the pop-up window may correspond to the visual effect on the scroll bar associated with the particular group of information. In some embodiments, the pop-up window may be displayed at a location near the scroll bar at a position corresponding to the presented page. For example, the pop-up window may be adjacent the scroll bar, at a position along the scroll bar corresponding to a positioning of a group of information in the presented page. In the context of this description, a pop-up window may refer to a visual element that appears as an overlay above an existing presented page on a screen. In some embodiments, content in the presented page may be rearranged around the pop-up window, so that content is not obscured by the pop-up window. Pop-up windows may appear in various sizes and positions on a display. In addition, pop-ups may present information and correspond to a GUI component with possible user interactions. In some embodiments, a pop-up window may appear next to or in place of the variable visual effect cursor of FIGS. 6A-6C.



FIGS. 7A and 7B illustrate examples of a display 708 presenting a page having a plurality of groups of information, showing a representation of a scroll bar 710 divided into sections of differing visual effects, and a pop-up window (718a, 718b), consistent with the disclosed embodiments. As shown, the entire page is constituted by the same three groups of information (702, 704, 706). Differences between FIG. 7A and FIG. 7B include the page presented on the display 708, and a differing appearance and position of a pop-up window (718a, 718b) along scroll bar 710. As shown in FIG. 7A, the presented page corresponds to the first group of information 702, and pop-up window 718a is displayed adjacent to a position along scroll bar 710 in first section 712. Consistent with the description of the variable visual effect cursor, pop-up window 718a is assigned a visual effect corresponding to the visual effect assigned to the first group 702. As shown in FIG. 7A, pop-up window 718a may also display identifying information about first group 702, such as a name or title of the first group 702. In FIG. 7B, the presented page may correspond to the second group of information 704 due to a scroll, and accordingly, pop-up window 718b may be located adjacent to a position along the scroll bar 710 in second section 714, where the position in second section 714 corresponds to the displayed portion of the second group 704. As shown, pop-up window 718b may adopt a visual effect corresponding to the visual effect assigned to the second group 704. Similar to pop-up window 718a, pop-up window 718b may display identifying information about second group 704, such as a name or title of the second group 704.


Although FIGS. 7A and 7B illustrate a pop-up window (718a, 718b) adopting a rectangular shape, it is to be understood that any other suitable shape such as an oval, a speech bubble, a square etc. may be used. It is also to be understood that different ways of implementing the visual effect in the pop-up window may be possible. As discussed above, in some embodiments, the visual effect of the pop-up window may correspond to a visual effect of text displayed in the pop-up window. For example, a color and/or texture of the pop-up window may correspond to the visual effect of the corresponding group of information. In some embodiments, the pop-up window may have a different color, including a fixed color, and information displayed in the pop-up window, such as text displayed in the pop-up window, may have a color and/or texture corresponding to the visual effect of the corresponding group of information.


In some embodiments, the pop-up window (718a, 718b) may be configured to disappear from the display 708 after a predetermined time. In some other embodiments, wherein the scroll bar 710 is configured to disappear from the display 708 after a predetermined time, the pop-up window (718a, 718b) may be configured to disappear from the display 708 after a time less than or equal to the predetermined time.


In some embodiments, a variable visual effect cursor, such as the one described above, may be displayed in conjunction with the pop-up window. For example, after receiving at least one of the initial scrolling signal or the continued scrolling signal, both a pop-up window and a variable visual effect cursor may be displayed. In some embodiments, an input associated with dragging the variable visual effect cursor (such as the cursor shown in FIGS. 6A-6C) in a direction different than the scroll direction, such as dragging the cursor in a direction perpendicular to a length of the scroll bar, may cause a pop-up window such as the one described above to be displayed. For instance, in the example shown in FIG. 6A, an input associated with selecting and/or horizontally dragging the variable visual effect cursor 618a may cause display of a pop-up window similar to the one illustrated in FIG. 7A (718a). The pop-up window may be displayed in place of the cursor, resulting in an appearance similar to FIG. 7A or 7B. Alternatively, the pop-up window may be displayed near the cursor (not shown).


In some embodiments, in response to the detection of the initial scrolling signal or the continuous scrolling signal, at least one processor may trigger a haptic signal at a transition from one of the plurality of groups of information to an adjacent one of the plurality of groups of information. Haptic signals can include any type of communication signal related to the sense of touch. Haptic signals may be implemented in a device in many ways. For example, a device can interact with a user by applying tactile, vibrotactile, electro-tactile, thermal, force feedback, or any other type of feedback that can be felt by a user. Additionally, or alternatively, a sound effect may be triggered at the transitions from one of the groups of information to an adjacent group of information.



FIG. 8 illustrates an exemplary presentation of a page having one or more groups of information on a display 808. At the right-hand side of display 808, a scroll bar 810 is divided into sections of differing visual effects that correspond to the plurality of groups of information in the entire page. As shown on the left side of FIG. 8, the presented page includes a first group of information 802 and a second group of information 804, and a variable visual effect cursor 818 positioned along scroll bar 810 adjacent to section 812. In the example shown, section 812 corresponds to the first group of information 802, and section 814 corresponds to the second group of information 804. A scrolling signal may cause the presented page to scroll and transition from the left-hand appearance to the appearance shown on the right, in which the presented page corresponds to the second group of information 804. During the scroll, variable visual effect cursor 818 may move along scroll bar 810 from section 812 to section 814. Between the two situations illustrated in FIG. 8, a transition occurs between the first 802 and second 804 groups of information being displayed, and a haptic signal may be triggered at this transition indicating a transition between two different groups of information. In the example shown, in response to the variable visual effect cursor 818 transitioning from first section 812 to second section 814, a haptic signal 820 may be triggered. Additionally, or alternatively, a sound effect may be triggered at the transitions from one of the groups of information to an adjacent group of information. As a non-limiting example, a click or tick sound may replace or supplement the triggered haptic signal.
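As a non-limiting illustration, detecting a crossing between adjacent groups during a scroll may be sketched as follows. The function names and the callback mechanism are illustrative assumptions.

```python
def group_at(offset, group_sizes):
    """Return the index of the group containing the given page offset."""
    edge = 0
    for i, size in enumerate(group_sizes):
        edge += size
        if offset < edge:
            return i
    return len(group_sizes) - 1  # clamp offsets past the end to the last group

def scroll_to(old_offset, new_offset, group_sizes, trigger_haptic):
    """Fire the haptic (or sound) callback when a scroll crosses a boundary
    between one group of information and an adjacent group."""
    if group_at(new_offset, group_sizes) != group_at(old_offset, group_sizes):
        trigger_haptic()

events = []
# Scrolling from offset 900 to 1100 crosses the boundary between the first
# group (size 1000) and the second group, so the callback fires once.
scroll_to(900, 1100, [1000, 800, 1200], lambda: events.append("buzz"))
```

The same boundary check could drive a vibrotactile pulse, a click sound, or both, per the alternatives described above.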


In some embodiments, at least one processor may determine that the length of one of the sections is smaller than a predetermined length threshold. During scrolling in the one of the sections, the at least one processor may cause display of an enlarged version of the one of the sections. In some embodiments, the enlarged version may be presented adjacent to the scroll bar. An enlarged version of a section may refer to a version of the section where at least one of the dimensions of the section has been increased beyond an original dimension of the section in the scroll bar as it was originally displayed. For example, an enlarged version of a section may correspond to the display of the section with an increased length, an increased width, or both. In some embodiments, the predetermined threshold may be at least one of a predetermined percentage of a primary dimension of the scroll bar or a predetermined number of pixels of the display, or may be manually defined by a user input.
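As a non-limiting illustration, the threshold check and the construction of an enlarged version may be sketched as follows. The 10% threshold, the 3x scale factor, and the data layout are illustrative assumptions.

```python
def display_sections(group_sizes, bar_length, min_fraction=0.10, scale=3.0):
    """Compute proportional section lengths; any section shorter than the
    threshold (a fraction of the bar's primary dimension) is given an
    enlarged version to be drawn while that section is being scrolled."""
    total = sum(group_sizes)
    sections = []
    for size in group_sizes:
        length = bar_length * size / total
        # Enlarge only the sections below the predetermined length threshold.
        enlarged = length * scale if length < bar_length * min_fraction else None
        sections.append({"length": length, "enlarged": enlarged})
    return sections

# The middle group is small, so its 5-unit section falls below the
# 10-unit threshold and receives an enlarged version.
sections = display_sections([450, 50, 500], bar_length=100)
```

The threshold could equally be expressed as an absolute pixel count or supplied by a user input, as noted above.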



FIGS. 9A-9C illustrate examples of graphical user interfaces with scroll bars consistent with disclosed embodiments. As shown, a page is presented with a plurality of groups of information (a first group 902, second group 904, and third group 906), and a scroll bar 910 is divided into sections of differing visual effects corresponding to each of the groups. For these figures, the entire page is made up of three groups of information (902, 904, 906), and the presented page differs between FIGS. 9A, 9B, and 9C, with differences in the appearance and position of a variable visual effect cursor 918 (similar to the cursor described above) along the scroll bar 910 and in the appearance of scroll bar 910. The second group 904 has an associated size smaller than the two other groups (902, 906); as a result, the length of the corresponding section 914 may be smaller than a predetermined threshold. A processor may determine that the size of section 914 is less than a predetermined threshold, based on a dimension of section 914 or other stored information indicative of the section size.


In FIG. 9A, the presented page corresponds to the first group of information 902; accordingly, the variable visual effect cursor 918 is located at a position along scroll bar 910 next to the first section 912. In FIG. 9B, the presented page is scrolled to move the second group of information 904 to the top of the presented page. The length of the corresponding section 914, being proportional to the associated size of the second group 904, may be determined to be smaller than a predetermined threshold (for example, less than 10% of the total vertical dimension of scroll bar 910). In response to this determination, at least one processor may generate and display an enlarged version (914b) of section 914 along the scroll bar 910. In this example, enlarged version 914b may be displayed in an expanded view of the scroll bar 910, such as by displaying enlarged version 914b adjacent a position of section 914. Variable visual effect cursor 918 may be displayed along the enlarged version of section 914. In some embodiments, a position of the variable visual effect cursor 918 along the enlarged version of the section may be proportional to a portion of the corresponding group of information displayed on the presented page. As such, a higher proportionality factor may be applied for the enlarged version 914b compared to the other sections (912, 916).


In response to a continuous scroll signal that scrolls past the second group 904 and into section 916 corresponding to the third group 906, enlarged version 914b may disappear or revert to section 914, as shown in FIG. 9C. As shown, the presented page corresponds to the third group of information 906; accordingly, the variable visual effect cursor 918 may be located along scroll bar 910 adjacent a corresponding position in third section 916.


In some embodiments, a processor may determine that the length of one of the sections is smaller than a predetermined length threshold, and during scrolling in the one of the sections, may cause display of an enlarged version of the one of the sections within the scroll bar, rather than adjacent to the scroll bar. Referring to FIG. 9B, instead of displaying the enlarged version 914b of section 914 adjacent to the scroll bar 910, enlarged version 914b may be displayed within scroll bar 910, squeezing or displacing the other sections (912, 916). In some embodiments, enlarged version 914b may be superimposed over section 914. As such, a higher proportionality factor may be applied for scrolling within expanded section 914b, and a lower proportionality factor may be applied for scrolling the other sections (912, 916), similar to the proportionality factor applied before scrolling reached section 914.
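As a non-limiting illustration of applying different proportionality factors per section, a piecewise-linear map from bar position to page offset may be sketched as follows. The segment values are illustrative assumptions.

```python
def bar_to_page(bar_pos, bar_segments, page_segments):
    """Piecewise-linear map from a scroll bar position to a page offset.
    An expanded section occupies a larger share of the bar than its share
    of the page, so each section has its own proportionality factor."""
    bar_edge = page_edge = 0.0
    for bar_len, page_len in zip(bar_segments, page_segments):
        if bar_pos <= bar_edge + bar_len:
            # Linear interpolation within the current section.
            return page_edge + (bar_pos - bar_edge) * page_len / bar_len
        bar_edge += bar_len
        page_edge += page_len
    return page_edge  # past the last section: clamp to the page end

# The middle section is expanded from its proportional share to 20 bar
# units while still covering only 100 page units, so dragging inside it
# scrolls more slowly, giving finer control over the small group.
offset = bar_to_page(50.0, bar_segments=[40, 20, 40],
                     page_segments=[900, 100, 1000])
```

Positions in the unexpanded sections map through a correspondingly compressed factor, so the two ends of the bar still reach the start and end of the entire page.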


In some embodiments, during scrolling of one of the sections, a processor may be configured to modify a visual appearance of the one of the sections, such as by increasing a width of the one of the sections. Modifying a visual appearance of a section may include changing any type of visual characteristics of the section resulting in a different visual representation of the section. FIGS. 10A-10C illustrate a presented page with a first group of information 1002 and a second group of information 1004. Although not shown on the presented page, the entire page may also include a third group of information. A scroll bar may be divided into sections of differing visual effects, including first section 1012 corresponding to first group 1002, second section 1014 corresponding to the second group 1004, and third section 1016 corresponding to the third group (not shown).


In some embodiments, one of the sections that is currently being scrolled through may have a modified visual appearance. In some embodiments such as the embodiment shown in FIG. 10A, the visual appearance of the one of the sections may be modified by increasing a width of the one of the sections. As illustrated in FIG. 10A, section 1012 currently being scrolled corresponds to the first group of information 1002. During scrolling through the first group 1002, a width of section 1012 may be increased relative to the width of scroll bar 1010 and the width of the other sections (1014, 1016), thereby enhancing the user's experience and ability to quickly identify the section of scroll bar 1010 that is currently being scrolled. As a result, a user may be able to more quickly and easily understand what portion of the entire page is currently being navigated. In some embodiments, modifying the visual appearance of the one of the sections may include changing an appearance of the contour of the one of the sections. Referring to FIG. 10B, the section 1012 currently being scrolled corresponds to the first group of information 1002, and a contour of section 1012 may be emphasized or highlighted using an outline, shadow, or glow, thereby further distinguishing first section 1012 from the other sections (1014, 1016). As another example, modifying the visual appearance of the one of the sections may include adjusting an opacity or transparency of one or more sections. For example, an opacity of the one of the sections may be maintained, while reducing the opacity of sections other than the one of the sections, thereby increasing a transparency of those other sections. For example, as illustrated in FIG. 10C, section 1012 is currently being scrolled and corresponds to the first group of information 1002.
An opacity of the visual effect of section 1012 may be maintained while reducing the opacity of the other sections (1014, 1016), distinguishing first section 1012 from the other sections (1014, 1016). If the presented page includes content that is normally obscured by the scroll bar 1010, then reducing an opacity (or increasing transparency) may cause the obscured content to become visible or partially visible. Although FIGS. 10A-10C depict three exemplary processes modifying the appearance of section 1012, any other suitable visual characteristics of section 1012 may be changed, in order to visually distinguish first section 1012 from the other sections (1014, 1016).



FIG. 11 is an illustration of another exemplary graphical user interface consistent with disclosed embodiments. As shown, the presented page shown in FIG. 11 includes a group of information 1104 (“Group B”), and a scroll bar 1110 is located on the right side of the display 1108, divided into sections (1112, 1114, 1116, 1118, 1120, 1122, and 1124), each section having a different and unique visual effect, portrayed using texture patterns in this example. In this example, display 1108 may be a smartphone or tablet device screen. A variable visual effect cursor 1126 is positioned adjacent section 1114 of scroll bar 1110, at a position that corresponds to scrolling in group 1104. Cursor 1126 is depicted near the top of section 1114, indicating to a user that the presented page represents content near the top of Group B. Groups of information in this example may correspond to a data table comprising a plurality of rows and columns, as shown. The cumulative size of all the groups of information (here seven groups represented by sections 1112-1124) is larger than the two dimensions of the smartphone screen 1108. In the displayed example, a second and horizontal standard scroll bar (not displayed in FIG. 11) may be provided in order to allow for horizontal scrolling of the table content.



FIG. 12 is a block diagram of an exemplary computing device 1200 consistent with some embodiments. In some embodiments, computing device 1200 may be similar in type and function to user device 1320, discussed with respect to FIG. 13. As shown in FIG. 12, computing device 1200 may include processing circuitry 1210, such as, for example, a central processing unit (CPU). In some embodiments, the processing circuitry 1210 may include, or may be a component of, a larger processing unit implemented with one or more processors. The one or more processors may be implemented with any combination of general-purpose microprocessors, microcontrollers, digital signal processors (DSPs), field programmable gate arrays (FPGAs), programmable logic devices (PLDs), controllers, state machines, gated logic, discrete hardware components, dedicated hardware finite state machines, or any other suitable entities that can perform calculations or other manipulations of information. The processing circuitry such as processing circuitry 1210 may be coupled via a bus 1205 to a memory 1220.


The memory 1220 may further include a memory portion 1222 that may contain instructions that when executed by the processing circuitry 1210, may perform the method described in more detail herein. The memory 1220 may be further used as a working scratch pad for the processing circuitry 1210, a temporary storage, and others, as the case may be. The memory 1220 may be a volatile memory such as, but not limited to, random access memory (RAM), or non-volatile memory (NVM), such as, but not limited to, flash memory. The processing circuitry 1210 may be further connected to a network device 1240, such as a network interface card, for providing connectivity between the computing device 1200 and a network, such as a network 1310, discussed in more detail with respect to FIG. 13 below. The processing circuitry 1210 may be further coupled with a storage device 1230. The storage device 1230 may be used for the purpose of storing single data type column-oriented data structures, data elements associated with the data structures, or any other data structures. While illustrated in FIG. 12 as a single device, it is to be understood that storage device 1230 may include multiple devices either collocated or distributed.


The processing circuitry 1210 and/or the memory 1220 may also include machine-readable media for storing software. “Software” as used herein refers broadly to any type of instructions, whether referred to as software, firmware, middleware, microcode, hardware description language, or otherwise. Instructions may include code (e.g., in source code format, binary code format, executable code format, or any other suitable format of code). The instructions, when executed by the one or more processors, may cause the processing system to perform the various functions described in further detail herein.


In some embodiments, computing device 1200 may include one or more input and output devices (not shown in the figure). Computing device 1200 may also include a display 1250, such as a touchscreen display or other display types discussed herein.



FIG. 13 is a block diagram of computing architecture 1300 that may be used in connection with various disclosed embodiments. The computing device 1200, as described in connection with FIG. 12, may be coupled to network 1310. The network 1310 may enable communication between different elements that may be communicatively coupled with the computing device 1200, as further described below. The network 1310 may include the Internet, the world-wide-web (WWW), a local area network (LAN), a wide area network (WAN), a metro area network (MAN), and other networks capable of enabling communication between the elements of the computing architecture 1300. In some disclosed embodiments, the computing device 1200 may be a server deployed in a cloud computing environment.


One or more user devices 1320-1 through 1320-m, where ‘m’ is an integer equal to or greater than 1, referred to individually as user device 1320 and collectively as user devices 1320, may be communicatively coupled with the computing device 1200 via the network 1310. A user device 1320 may be, for example, a smartphone, a mobile phone, a laptop, a tablet computer, a wearable computing device, a personal computer (PC), a smart television, and the like. A user device 1320 may be configured to send to and receive from the computing device 1200 data and/or metadata associated with a variety of elements associated with single data type column-oriented data structures, such as columns, rows, cells, schemas, and the like.


One or more data repositories 1330-1 through 1330-n, where ‘n’ is an integer equal to or greater than 1, referred to individually as data repository 1330 and collectively as data repositories 1330, may be communicatively coupled with the computing device 1200 via the network 1310, or embedded within the computing device 1200. Each data repository 1330 may be communicatively connected to the network 1310 through one or more database management systems (DBMS) 1335-1 through 1335-n. A data repository 1330 may be, for example, a storage device containing a database, a data warehouse, and the like, that may be used for storing data structures, data items, metadata, or any other information, as further described below. In some embodiments, one or more of the repositories may be distributed over several physical storage devices, e.g., in a cloud-based computing environment. Any storage device may be a network accessible storage device, or a component of the computing device 1200.


The embodiments disclosed herein are exemplary and any other means for performing and facilitating display navigation operations may be consistent with this disclosure.


The foregoing description has been presented for purposes of illustration. It is not exhaustive and is not limited to the precise forms or embodiments disclosed. Modifications and adaptations will be apparent to those skilled in the art from consideration of the specification and practice of the disclosed embodiments.


Moreover, while illustrative embodiments have been described herein, the scope of this disclosure includes any and all embodiments having equivalent elements, modifications, omissions, combinations (e.g., of aspects across various embodiments), adaptations, and/or alterations as would be appreciated by those skilled in the art based on the present disclosure. The limitations in the claims are to be interpreted broadly based on the language employed in the claims and not limited to examples described in the present specification or during the prosecution of the application. The examples are to be construed as non-exclusive. Furthermore, the steps of the disclosed methods may be modified in any manner, including by reordering steps and/or inserting or deleting steps. It is intended, therefore, that the specification and examples be considered as illustrative only, with a true scope and spirit being indicated by the following claims and their full scope of equivalents.


Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art. The materials, methods, and examples provided herein are illustrative only and not intended to be limiting.


Implementation of the method and system of the present disclosure may involve performing or completing certain selected tasks or steps manually, automatically, or a combination thereof. Moreover, according to actual instrumentation and equipment of preferred embodiments of the method and system of the present disclosure, several selected steps may be implemented by hardware (HW) or by software (SW) running on any operating system or firmware, or by a combination thereof. For example, as hardware, selected steps of the disclosure could be implemented as a chip or a circuit. As software or an algorithm, selected steps of the disclosure could be implemented as a plurality of software instructions executed by a computer using any suitable operating system. In any case, selected steps of the method and system of the disclosure could be described as being performed by a data processor, such as a computing device for executing a plurality of instructions.


As used herein, the terms “machine-readable medium” and “computer-readable medium” refer to any computer program product, apparatus, and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term “machine-readable signal” refers to any signal used to provide machine instructions and/or data to a programmable processor.


Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, specially designed ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various implementations can include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device.


Although the present disclosure is described with regard to a “computing device”, a “computer”, or “mobile device”, it should be noted that optionally any device featuring a data processor and the ability to execute one or more instructions may be described as a computing device, including but not limited to any type of personal computer (PC), a server, a distributed server, a virtual server, a cloud computing platform, a cellular telephone, an IP telephone, a smartphone, a smart watch or a PDA (personal digital assistant). Any two or more of such devices in communication with each other may optionally comprise a “network” or a “computer network”.


To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having a display device (e.g., an LED (light-emitting diode), OLED (organic LED), or LCD (liquid crystal display) monitor/screen) for displaying information to the user, and either a touch-sensitive layer such as a touchscreen, or a keyboard and a pointing device (e.g., a mouse or a trackball), by which the user can provide input to the computer. Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback), and input from the user can be received in any form, including acoustic, speech, or tactile input.


The systems and techniques described here can be implemented in a computing system that includes a back end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front end component (e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back end, middleware, or front end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network (“LAN”), a wide area network (“WAN”), and the Internet.


The computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.


It should be appreciated that the above described methods and apparatus may be varied in many ways, including omitting or adding steps, changing the order of steps and the type of devices used. It should be appreciated that different features may be combined in different ways. In particular, not all the features shown above in a particular embodiment or implementation are necessary in every embodiment or implementation of the invention. Further combinations of the above features and implementations are also considered to be within the scope of some embodiments or implementations of the invention.


While certain features of the described implementations have been illustrated as described herein, many modifications, substitutions, changes, and equivalents will now occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the scope of the implementations. It should be understood that the implementations have been presented by way of example only, not limitation, and various changes in form and details may be made. Any portion of the apparatus and/or methods described herein may be combined in any combination, except mutually exclusive combinations. The implementations described herein can include various combinations and/or sub-combinations of the functions, components, and/or features of the different implementations described.


Systems and methods disclosed herein involve unconventional improvements over conventional approaches. Descriptions of the disclosed embodiments are not exhaustive and are not limited to the precise forms or embodiments disclosed. Modifications and adaptations of the embodiments will be apparent from consideration of the specification and practice of the disclosed embodiments. Additionally, the disclosed embodiments are not limited to the examples discussed herein.


The foregoing description has been presented for purposes of illustration. It is not exhaustive and is not limited to the precise forms or embodiments disclosed. Modifications and adaptations of the embodiments will be apparent from consideration of the specification and practice of the disclosed embodiments. For example, the described implementations include hardware and software, but systems and methods consistent with the present disclosure may be implemented as hardware alone.


It is appreciated that the above described embodiments can be implemented by hardware, or software (program codes), or a combination of hardware and software. If implemented by software, the software can be stored in the above-described computer-readable media and, when executed by the processor, can perform the disclosed methods. The computing units and other functional units described in the present disclosure can be implemented by hardware, or software, or a combination of hardware and software. One of ordinary skill in the art will also understand that multiple ones of the above described modules/units can be combined as one module or unit, and each of the above described modules/units can be further divided into a plurality of sub-modules or sub-units.


The block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer hardware or software products according to various example embodiments of the present disclosure. In this regard, each block in a flowchart or block diagram may represent a module, segment, or portion of code, which includes one or more executable instructions for implementing the specified logical functions. It should be understood that in some alternative implementations, functions indicated in a block may occur out of the order noted in the figures. For example, two blocks shown in succession may be executed or implemented substantially concurrently, or two blocks may sometimes be executed in reverse order, depending upon the functionality involved. Some blocks may also be omitted. It should also be understood that each block of the block diagrams, and combinations of the blocks, may be implemented by special purpose hardware-based systems that perform the specified functions or acts, or by combinations of special purpose hardware and computer instructions.


In the foregoing specification, embodiments have been described with reference to numerous specific details that can vary from implementation to implementation. Certain adaptations and modifications of the described embodiments can be made. Other embodiments can be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. It is intended that the specification and examples be considered as examples only, with a true scope and spirit of the invention being indicated by the following claims. It is also intended that the sequence of steps shown in the figures is only for illustrative purposes and is not intended to be limited to any particular sequence of steps. As such, those skilled in the art can appreciate that these steps can be performed in a different order while implementing the same method.


It will be appreciated that the embodiments of the present disclosure are not limited to the exact construction that has been described above and illustrated in the accompanying drawings, and that various modifications and changes may be made without departing from the scope thereof.


Other embodiments will be apparent to those skilled in the art from consideration of the specification and practice of the embodiments disclosed herein. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosed embodiments being indicated by the following claims.


Computer programs based on the written description and methods of this specification are within the skill of a software developer. The various programs or program modules can be created using a variety of programming techniques. One or more of such software sections or modules can be integrated into a computer system, non-transitory computer readable media, or existing software.


This disclosure employs open-ended permissive language, indicating for example, that some embodiments “may” employ, involve, or include specific features. The use of the term “may” and other open-ended terminology is intended to indicate that although not every embodiment may employ the specific disclosed feature, at least one embodiment employs the specific disclosed feature.


Various terms used in the specification and claims may be defined or summarized differently when discussed in connection with differing disclosed embodiments. It is to be understood that the definitions, summaries, and explanations of terminology in each instance apply to all instances, even when not repeated, unless the transplanted definition, explanation, or summary would result in inoperability of an embodiment.


Moreover, while illustrative embodiments have been described herein, the scope includes any and all embodiments having equivalent elements, modifications, omissions, combinations (e.g., of aspects across various embodiments), adaptations or alterations based on the present disclosure. The elements in the claims are to be interpreted broadly based on the language employed in the claims and not limited to examples described in the present specification or during the prosecution of the application. These examples are to be construed as non-exclusive. Further, the steps of the disclosed methods can be modified in any manner, including by reordering steps or inserting or deleting steps. It is intended, therefore, that the specification and examples be considered as exemplary only, with a true scope and spirit being indicated by the following claims and their full scope of equivalents.

Claims
  • 1. A non-transitory computer-readable medium containing instructions that, when executed, cause at least one processor to perform display navigation operations, the operations comprising: presenting a plurality of groups of information on a display, in a form of a page, each of the plurality of groups of information having an associated size, wherein a cumulative size of the plurality of groups of information is larger than at least one dimension of the display; receiving an initial scrolling signal for causing the presented page to scroll on the display; augmenting the display with a scroll bar divided into sections of differing visual effects, wherein the differing visual effects of each of the sections correspond to an associated visual effect assigned to one group of the plurality of groups of information, a length of each of the sections is proportional to the associated size of the one group relative to the cumulative size of the plurality of groups, an order of the visual effects in the scroll bar corresponds to an order of the groups of information in the presented page, and each section of the scroll bar is directly associated with each group of information; responsive to detection of the initial scrolling signal, augmenting the display with a variable visual effect cursor located adjacent to the scroll bar; during scrolling within a particular group of information, displaying a visual effect of the variable visual effect cursor that corresponds to a visual effect on the scroll bar section associated with the particular group of information; and in response to detecting a dragging of the variable visual effect cursor to a particular location along the scroll bar, causing the page to scroll to a corresponding particular location in the entire page.
  • 2. The non-transitory computer-readable medium of claim 1, wherein all of the sections are presented on the display while the scroll bar is presented.
  • 3. The non-transitory computer-readable medium of claim 1, wherein an entire page includes the presented page and one or more groups of information that are not displayed in the presented page, and an interaction with a particular location on the scroll bar scrolls the page to a corresponding particular location in the entire page.
  • 4. The non-transitory computer-readable medium of claim 1, wherein the differing visual effects include a unique color associated with each group of the plurality of groups of information, or a unique combination of a color and a texture associated with each group of the plurality of groups of information.
  • 5. The non-transitory computer-readable medium of claim 4, wherein the operations further comprise: responsive to a determination that a same color is assigned to more than one of the plurality of groups of information, assigning a different texture to each group of the plurality of groups of information having the same color.
  • 6. The non-transitory computer-readable medium of claim 1, wherein each of the plurality of groups of information has a predetermined visual effect.
  • 7. The non-transitory computer-readable medium of claim 1, wherein the operations further comprise: responsive to a determination that one of the plurality of groups of information was not assigned with a visual effect prior to the scroll, assigning a random visual effect to the one of the plurality of groups of information, different from other visual effects assigned to another group of information.
  • 8. The non-transitory computer-readable medium of claim 1, wherein the operations further comprise: responsive to a determination that elements from different groups of information are combined to form a new group of information, assigning to the new group of information a visual effect that corresponds to a combination of visual effects of the different groups of information from which the elements originate.
  • 9. The non-transitory computer-readable medium of claim 1, wherein the operations further comprise: receiving a continued scrolling signal following the initial scrolling signal; and in response to the continued scrolling signal, causing the page to scroll on the display while maintaining a static position of the scroll bar on the display.
  • 10. The non-transitory computer-readable medium of claim 9, wherein the operations further comprise: responsive to at least one of the initial scrolling signal or the continued scrolling signal, causing a pop-up window to appear displaying a name of one of the plurality of groups of information associated with a current position of scrolling, wherein a visual effect of a text displayed in the pop-up window corresponds to the visual effect on the scroll bar associated with the particular group of information.
  • 11. The non-transitory computer-readable medium of claim 1, wherein the operations further comprise: responsive to detection of the initial scrolling signal, augmenting the display with a variable visual effect cursor located near the scroll bar; during scrolling within a particular group of information, displaying a visual effect of the variable visual effect cursor that corresponds to the visual effect on the scroll bar associated with the particular group of information; and, in response to detecting a dragging of the variable visual effect cursor to a particular location along the scroll bar, causing the page to scroll to a corresponding particular location in the entire page.
  • 12. The non-transitory computer-readable medium of claim 1, wherein the operations further comprise: responsive to detection of the initial scrolling signal, triggering a haptic signal at a transition from one of the plurality of groups of information to an adjacent one of the plurality of groups of information.
  • 13. The non-transitory computer-readable medium of claim 1, wherein the operations further comprise: determining that the length of one of the sections is smaller than a predetermined length threshold; and during scrolling in the one of the sections, causing display of an enlarged version of the one of the sections, wherein the enlarged version is presented adjacent to the scroll bar.
  • 14. The non-transitory computer-readable medium of claim 1, wherein the operations further comprise, during scrolling of one of the sections, increasing a width of the one of the sections.
  • 15. The non-transitory computer-readable medium of claim 1, wherein the variable visual effect cursor is configured to disappear from the display after a predetermined time period.
  • 16. A method for display navigation, comprising: presenting a plurality of groups of information on a display, in a form of a page, each of the plurality of groups of information having an associated size, wherein a cumulative size of the plurality of groups of information is larger than at least one dimension of the display; receiving an initial scrolling signal for causing the presented page to scroll on the display; augmenting the display with a scroll bar divided into sections of differing visual effects, wherein the differing visual effects of each of the sections correspond to an associated visual effect assigned to one group of the plurality of groups of information, a length of each of the sections is proportional to the associated size of the one group relative to the cumulative size of the plurality of groups, an order of the visual effects in the scroll bar corresponds to an order of the groups of information in the presented page, and each section of the scroll bar is directly associated with each group of information; responsive to detection of the initial scrolling signal, augmenting the display with a variable visual effect cursor located adjacent to the scroll bar; during scrolling within a particular group of information, displaying a visual effect of the variable visual effect cursor that corresponds to a visual effect on the scroll bar section associated with the particular group of information; and in response to detecting a dragging of the variable visual effect cursor to a particular location along the scroll bar, causing the page to scroll to a corresponding particular location in the entire page.
  • 17. The method of claim 16, further comprising: responsive to detection of the initial scrolling signal, augmenting the display with a variable visual effect cursor located near the scroll bar; during scrolling within a particular group of information, displaying a visual effect of the variable visual effect cursor that corresponds to the visual effect on the scroll bar associated with the particular group of information; and, in response to detecting a dragging of the variable visual effect cursor to a particular location along the scroll bar, causing the page to scroll to a corresponding particular location in the entire page.
  • 18. The method of claim 16, further comprising: determining that the length of one of the sections is smaller than a predetermined length threshold; and during scrolling in the one of the sections, causing display of an enlarged version of the one of the sections, wherein the enlarged version is presented adjacent to the scroll bar.
  • 19. A system for performing display navigation operations on a display having dimensions smaller than a page presented on the display, the system comprising: a memory storing instructions; and at least one processor that executes the stored instructions to: present a plurality of groups of information on a display, in a form of a page, each of the plurality of groups of information having an associated size, wherein a cumulative size of the plurality of groups of information is larger than at least one dimension of the display; receive an initial scrolling signal for causing the presented page to scroll on the display; augment the display with a scroll bar divided into sections of differing colors, wherein each of the sections is colored in a differing color assigned to one group of the plurality of groups of information, a length of each of the sections is proportional to the associated size of the one group relative to the cumulative size of the plurality of groups, an order of the colors in the scroll bar corresponds to an order of the groups of information in the presented page, and each section of the scroll bar is directly associated with each group of information; responsive to detection of the initial scrolling signal, augment the display with a variable color cursor located adjacent to the scroll bar; during scrolling within a particular group of information, display a color of the variable color cursor that corresponds to a color on the scroll bar section associated with the particular group of information; and in response to detecting a dragging of the variable color cursor to a particular location along the scroll bar, cause the page to scroll to a corresponding particular location in the entire page.
  • 20. The system of claim 19, wherein the at least one processor is further configured to: responsive to detection of the initial scrolling signal, augment the display with a variable color cursor located near the scroll bar; during scrolling within a particular group of information, display a color of the variable color cursor that corresponds to a color of the scroll bar section associated with the particular group of information; and, in response to detecting a dragging of the variable color cursor to a particular location along the scroll bar, cause the page to scroll to a corresponding particular location in the entire page.
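As a purely illustrative sketch, not part of the claims or specification, the proportional-section scroll bar described above can be modeled in a few lines: each group of information receives a section whose length is its size divided by the cumulative size of all groups, and a position along the bar maps back to a group. The group names, sizes, and the ScrollBarSection structure below are hypothetical.

```python
# Illustrative model of a scroll bar divided into sections whose lengths
# are proportional to the sizes of the groups of information they represent.
from dataclasses import dataclass


@dataclass
class ScrollBarSection:
    group: str      # name of the group of information (hypothetical)
    start: float    # start offset along the bar, in pixels
    length: float   # section length, proportional to the group's size


def build_scroll_bar(groups: dict[str, float], bar_length: float) -> list[ScrollBarSection]:
    """Divide a bar of bar_length pixels into one section per group, each
    section's length proportional to that group's size relative to the
    cumulative size of all groups; insertion order mirrors page order."""
    total = sum(groups.values())
    sections, offset = [], 0.0
    for name, size in groups.items():
        length = bar_length * size / total
        sections.append(ScrollBarSection(name, offset, length))
        offset += length
    return sections


def group_at(sections: list[ScrollBarSection], position: float) -> str:
    """Return the group whose section contains a position on the bar,
    e.g. where a cursor was dragged; positions past the end clamp to
    the final section."""
    for s in sections:
        if s.start <= position < s.start + s.length:
            return s.group
    return sections[-1].group


# Example: three groups of 200, 100, and 100 rows on a 400-pixel bar.
bar = build_scroll_bar({"Tasks": 200, "Bugs": 100, "Ideas": 100}, 400)
```

Because the sections partition the full bar in page order, dragging a cursor to any point on the bar identifies a group directly, and the fraction of the bar traversed corresponds to a location in the entire page.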
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is based on and claims the benefit of priority of U.S. Provisional Patent Application No. 63/273,453, filed on Oct. 29, 2021, the contents of which are incorporated herein by reference in their entirety.

9424287 Schroth Aug 2016 B2
9424333 Bisignani et al. Aug 2016 B1
9424545 Lee Aug 2016 B1
9430458 Rhee et al. Aug 2016 B2
9449031 Barrus et al. Sep 2016 B2
9495386 Tapley et al. Nov 2016 B2
9519699 Kulkarni et al. Dec 2016 B1
9558172 Rampson et al. Jan 2017 B2
9569511 Morin Feb 2017 B2
9613086 Sherman Apr 2017 B1
9635091 Laukkanen et al. Apr 2017 B1
9659284 Wilson et al. May 2017 B1
9679456 East Jun 2017 B2
9720602 Chen et al. Aug 2017 B1
9727376 Bills et al. Aug 2017 B1
9760271 Persaud Sep 2017 B2
9794256 Kiang et al. Oct 2017 B2
9798829 Baisley Oct 2017 B1
9811676 Gauvin Nov 2017 B1
9866561 Psenka et al. Jan 2018 B2
9870136 Pourshahid Jan 2018 B2
10001908 Grieve et al. Jun 2018 B2
10043296 Li Aug 2018 B2
10067928 Krappe Sep 2018 B1
10078668 Woodrow et al. Sep 2018 B1
10169306 O'Shaughnessy et al. Jan 2019 B2
10176154 Ben-Aharon et al. Jan 2019 B2
10235441 Makhlin et al. Mar 2019 B1
10255609 Kinkead et al. Apr 2019 B2
10282405 Silk et al. May 2019 B1
10282406 Bissantz May 2019 B2
10311080 Folting et al. Jun 2019 B2
10318624 Rosner et al. Jun 2019 B1
10327712 Beymer et al. Jun 2019 B2
10347017 Ruble et al. Jul 2019 B2
10372706 Chavan et al. Aug 2019 B2
10380140 Sherman Aug 2019 B2
10423758 Kido et al. Sep 2019 B2
10445702 Hunt Oct 2019 B1
10452360 Burman et al. Oct 2019 B1
10453118 Smith et al. Oct 2019 B2
10474317 Ramanathan et al. Nov 2019 B2
10489391 Tomlin Nov 2019 B1
10489462 Rogynskyy et al. Nov 2019 B1
10496737 Sayre et al. Dec 2019 B1
10505825 Bettaiah et al. Dec 2019 B1
10528599 Pandis et al. Jan 2020 B1
10534507 Laukkanen et al. Jan 2020 B1
10540152 Krishnaswamy et al. Jan 2020 B1
10540434 Migeon et al. Jan 2020 B2
10546001 Nguyen et al. Jan 2020 B1
10564622 Dean et al. Feb 2020 B1
10573407 Ginsburg Feb 2020 B2
10579724 Campbell et al. Mar 2020 B2
10587714 Kulkarni et al. Mar 2020 B1
10628002 Kang et al. Apr 2020 B1
10698594 Sanches et al. Jun 2020 B2
10706061 Sherman et al. Jul 2020 B2
10719220 Ouellet et al. Jul 2020 B2
10733256 Fickenscher et al. Aug 2020 B2
10740117 Ording et al. Aug 2020 B2
10747764 Plenderleith Aug 2020 B1
10747950 Dang et al. Aug 2020 B2
10748312 Ruble et al. Aug 2020 B2
10754688 Powell Aug 2020 B2
10761691 Anzures et al. Sep 2020 B2
10795555 Burke et al. Oct 2020 B2
10809696 Principato Oct 2020 B1
10817660 Rampson et al. Oct 2020 B2
D910077 Naroshevitch et al. Feb 2021 S
10963578 More et al. Mar 2021 B2
11010371 Slomka et al. May 2021 B1
11030259 Mullins et al. Jun 2021 B2
11042363 Krishnaswamy et al. Jun 2021 B1
11042699 Sayre et al. Jun 2021 B1
11048714 Sherman et al. Jun 2021 B2
11086894 Srivastava et al. Aug 2021 B1
11144854 Mouawad Oct 2021 B1
11222167 Gehrmann et al. Jan 2022 B2
11243688 Remy et al. Feb 2022 B1
11429384 Navert et al. Aug 2022 B1
11443390 Caligaris et al. Sep 2022 B1
11620615 Jiang et al. Apr 2023 B2
11682091 Sukman et al. Jun 2023 B2
20010008998 Tamaki et al. Jul 2001 A1
20010032248 Krafchin Oct 2001 A1
20010039551 Saito et al. Nov 2001 A1
20020002459 Lewis et al. Jan 2002 A1
20020065848 Walker et al. May 2002 A1
20020065849 Ferguson et al. May 2002 A1
20020065880 Hasegawa et al. May 2002 A1
20020069207 Alexander et al. Jun 2002 A1
20020075309 Michelman et al. Jun 2002 A1
20020082892 Raffel et al. Jun 2002 A1
20020099777 Gupta et al. Jul 2002 A1
20020138528 Gong et al. Sep 2002 A1
20030033196 Tomlin Feb 2003 A1
20030041113 Larsen Feb 2003 A1
20030051377 Chirafesi, Jr. Mar 2003 A1
20030052912 Bowman et al. Mar 2003 A1
20030058277 Bowman-Amuah Mar 2003 A1
20030065662 Cosic Apr 2003 A1
20030093408 Brown et al. May 2003 A1
20030101416 McInnes et al. May 2003 A1
20030135558 Bellotti et al. Jul 2003 A1
20030137536 Hugh Jul 2003 A1
20030187864 McGoveran Oct 2003 A1
20030200215 Chen et al. Oct 2003 A1
20030204490 Kasriel Oct 2003 A1
20030233224 Marchisio et al. Dec 2003 A1
20040032432 Baynger Feb 2004 A1
20040078373 Ghoneimy et al. Apr 2004 A1
20040098284 Petito et al. May 2004 A1
20040133441 Brady et al. Jul 2004 A1
20040138939 Theiler Jul 2004 A1
20040139400 Allam et al. Jul 2004 A1
20040162833 Jones et al. Aug 2004 A1
20040172592 Collie et al. Sep 2004 A1
20040212615 Uthe Oct 2004 A1
20040215443 Hatton Oct 2004 A1
20040230940 Cooper et al. Nov 2004 A1
20040268227 Brid Dec 2004 A1
20050034058 Mills et al. Feb 2005 A1
20050034064 Meyers et al. Feb 2005 A1
20050039001 Hudis et al. Feb 2005 A1
20050039033 Meyers et al. Feb 2005 A1
20050044486 Kotler et al. Feb 2005 A1
20050063615 Siegel et al. Mar 2005 A1
20050066306 Diab Mar 2005 A1
20050086360 Mamou et al. Apr 2005 A1
20050091314 Blagsvedt et al. Apr 2005 A1
20050091596 Anthony et al. Apr 2005 A1
20050096973 Heyse et al. May 2005 A1
20050114305 Haynes et al. May 2005 A1
20050125395 Boettiger Jun 2005 A1
20050165600 Kasravi et al. Jul 2005 A1
20050171881 Ghassemieh et al. Aug 2005 A1
20050210371 Pollock et al. Sep 2005 A1
20050216830 Turner et al. Sep 2005 A1
20050228250 Bitter et al. Oct 2005 A1
20050251021 Kaufman et al. Nov 2005 A1
20050257204 Bryant et al. Nov 2005 A1
20050278297 Nelson Dec 2005 A1
20050289170 Brown et al. Dec 2005 A1
20050289342 Needham et al. Dec 2005 A1
20050289453 Segal et al. Dec 2005 A1
20060009960 Valencot et al. Jan 2006 A1
20060013462 Sadikali Jan 2006 A1
20060015499 Clissold et al. Jan 2006 A1
20060015806 Wallace Jan 2006 A1
20060031148 O'Dell et al. Feb 2006 A1
20060031764 Keyser et al. Feb 2006 A1
20060036568 Moore et al. Feb 2006 A1
20060047811 Lau et al. Mar 2006 A1
20060053096 Subramanian et al. Mar 2006 A1
20060053194 Schneider et al. Mar 2006 A1
20060069604 Leukart et al. Mar 2006 A1
20060069635 Ram et al. Mar 2006 A1
20060080594 Chavoustie et al. Apr 2006 A1
20060085744 Hays et al. Apr 2006 A1
20060090169 Daniels et al. Apr 2006 A1
20060101324 Goldberg et al. May 2006 A1
20060106642 Reicher et al. May 2006 A1
20060107196 Thanu et al. May 2006 A1
20060111953 Setya May 2006 A1
20060129415 Thukral et al. Jun 2006 A1
20060129913 Vigesaa et al. Jun 2006 A1
20060136828 Asano Jun 2006 A1
20060150090 Swamidass Jul 2006 A1
20060173908 Browning et al. Aug 2006 A1
20060190313 Lu Aug 2006 A1
20060212299 Law Sep 2006 A1
20060224542 Yalamanchi Oct 2006 A1
20060224568 Debrito Oct 2006 A1
20060224946 Barrett et al. Oct 2006 A1
20060236246 Bono et al. Oct 2006 A1
20060250369 Keim Nov 2006 A1
20060253205 Gardiner Nov 2006 A1
20060271574 Villaron et al. Nov 2006 A1
20060287998 Folting et al. Dec 2006 A1
20060294451 Kelkar et al. Dec 2006 A1
20070027932 Thibeault Feb 2007 A1
20070032993 Yamaguchi et al. Feb 2007 A1
20070033531 Marsh Feb 2007 A1
20070050322 Vigesaa et al. Mar 2007 A1
20070050379 Day et al. Mar 2007 A1
20070073899 Judge et al. Mar 2007 A1
20070092048 Chelstrom et al. Apr 2007 A1
20070094607 Morgan et al. Apr 2007 A1
20070101291 Forstall et al. May 2007 A1
20070106754 Moore May 2007 A1
20070118527 Winje et al. May 2007 A1
20070118813 Forstall et al. May 2007 A1
20070143169 Grant et al. Jun 2007 A1
20070150389 Aamodt et al. Jun 2007 A1
20070168861 Bell et al. Jul 2007 A1
20070174228 Folting et al. Jul 2007 A1
20070174760 Chamberlain et al. Jul 2007 A1
20070186173 Both et al. Aug 2007 A1
20070192729 Downs Aug 2007 A1
20070220119 Himmelstein Sep 2007 A1
20070233647 Rawat et al. Oct 2007 A1
20070239746 Masselle et al. Oct 2007 A1
20070256043 Peters et al. Nov 2007 A1
20070282522 Geelen Dec 2007 A1
20070282627 Greenstein et al. Dec 2007 A1
20070283259 Barry et al. Dec 2007 A1
20070294235 Millett Dec 2007 A1
20070299795 Macbeth et al. Dec 2007 A1
20070300174 Macbeth et al. Dec 2007 A1
20070300185 Macbeth et al. Dec 2007 A1
20080004929 Raffel et al. Jan 2008 A9
20080005235 Hegde et al. Jan 2008 A1
20080033777 Shukoor Feb 2008 A1
20080034307 Cisler et al. Feb 2008 A1
20080034314 Louch et al. Feb 2008 A1
20080052291 Bender Feb 2008 A1
20080059312 Gern et al. Mar 2008 A1
20080059539 Chin et al. Mar 2008 A1
20080065460 Raynor Mar 2008 A1
20080077530 Banas et al. Mar 2008 A1
20080097748 Haley et al. Apr 2008 A1
20080104091 Chin May 2008 A1
20080126389 Mush et al. May 2008 A1
20080133736 Wensley et al. Jun 2008 A1
20080148140 Nakano Jun 2008 A1
20080155547 Weber et al. Jun 2008 A1
20080163075 Beck et al. Jul 2008 A1
20080183593 Dierks Jul 2008 A1
20080195948 Bauer Aug 2008 A1
20080209318 Allsop et al. Aug 2008 A1
20080216022 Lorch et al. Sep 2008 A1
20080222192 Hughes Sep 2008 A1
20080256014 Gould et al. Oct 2008 A1
20080256429 Penner et al. Oct 2008 A1
20080270597 Tenenti Oct 2008 A1
20080282189 Hofmann et al. Nov 2008 A1
20080295038 Helfman et al. Nov 2008 A1
20080301237 Parsons Dec 2008 A1
20090006171 Blatchley et al. Jan 2009 A1
20090006283 Labrie et al. Jan 2009 A1
20090007157 Ward et al. Jan 2009 A1
20090013244 Cudich et al. Jan 2009 A1
20090019383 Riley et al. Jan 2009 A1
20090024944 Louch et al. Jan 2009 A1
20090043814 Faris et al. Feb 2009 A1
20090044090 Gur et al. Feb 2009 A1
20090048896 Anandan Feb 2009 A1
20090049372 Goldberg Feb 2009 A1
20090075694 Kim Mar 2009 A1
20090077164 Phillips et al. Mar 2009 A1
20090077217 McFarland et al. Mar 2009 A1
20090083140 Phan Mar 2009 A1
20090094514 Dargahi et al. Apr 2009 A1
20090113310 Appleyard et al. Apr 2009 A1
20090129596 Chavez et al. May 2009 A1
20090132331 Cartledge et al. May 2009 A1
20090132470 Vignet May 2009 A1
20090150813 Chang et al. Jun 2009 A1
20090174680 Anzures et al. Jul 2009 A1
20090192787 Roon Jul 2009 A1
20090198715 Barbarek Aug 2009 A1
20090222760 Halverson et al. Sep 2009 A1
20090248710 McCormack et al. Oct 2009 A1
20090256972 Ramaswamy et al. Oct 2009 A1
20090262690 Breuer et al. Oct 2009 A1
20090271696 Bailor et al. Oct 2009 A1
20090276692 Rosner Nov 2009 A1
20090292690 Culbert Nov 2009 A1
20090313201 Huelsman et al. Dec 2009 A1
20090313537 Fu et al. Dec 2009 A1
20090313570 Po et al. Dec 2009 A1
20090319623 Srinivasan et al. Dec 2009 A1
20090319882 Morrison et al. Dec 2009 A1
20090327240 Meehan et al. Dec 2009 A1
20090327301 Lees et al. Dec 2009 A1
20090327851 Raposo Dec 2009 A1
20090327875 Kinkoh Dec 2009 A1
20100017699 Farrell et al. Jan 2010 A1
20100031135 Naghshin et al. Feb 2010 A1
20100070845 Facemire et al. Mar 2010 A1
20100070895 Messer Mar 2010 A1
20100082705 Ramesh et al. Apr 2010 A1
20100083164 Martin et al. Apr 2010 A1
20100088636 Yerkes et al. Apr 2010 A1
20100095219 Stachowiak et al. Apr 2010 A1
20100095298 Seshadrinathan et al. Apr 2010 A1
20100100427 McKeown et al. Apr 2010 A1
20100100463 Molotsi et al. Apr 2010 A1
20100114926 Agrawal et al. May 2010 A1
20100149005 Yoon et al. Jun 2010 A1
20100174678 Massand Jul 2010 A1
20100205521 Folting Aug 2010 A1
20100228752 Folting et al. Sep 2010 A1
20100241477 Nylander et al. Sep 2010 A1
20100241948 Andeen et al. Sep 2010 A1
20100241968 Tarara et al. Sep 2010 A1
20100241972 Spataro et al. Sep 2010 A1
20100241990 Gabriel et al. Sep 2010 A1
20100251090 Chamberlain et al. Sep 2010 A1
20100251386 Gilzean et al. Sep 2010 A1
20100257015 Molander Oct 2010 A1
20100262625 Pittenger Oct 2010 A1
20100268705 Douglas et al. Oct 2010 A1
20100268773 Hunt et al. Oct 2010 A1
20100287163 Sridhar et al. Nov 2010 A1
20100287221 Battepati et al. Nov 2010 A1
20100313119 Baldwin et al. Dec 2010 A1
20100324964 Callanan et al. Dec 2010 A1
20100332973 Kloiber et al. Dec 2010 A1
20110010340 Hung et al. Jan 2011 A1
20110016432 Helfman Jan 2011 A1
20110028138 Davies-Moore et al. Feb 2011 A1
20110047484 Mount et al. Feb 2011 A1
20110055177 Chakra et al. Mar 2011 A1
20110066933 Ludwig Mar 2011 A1
20110071869 O'Brien et al. Mar 2011 A1
20110106636 Spear et al. May 2011 A1
20110119352 Perov et al. May 2011 A1
20110154192 Yang et al. Jun 2011 A1
20110179371 Kopycinski et al. Jul 2011 A1
20110205231 Hartley et al. Aug 2011 A1
20110208324 Fukatsu Aug 2011 A1
20110208732 Melton et al. Aug 2011 A1
20110209150 Hammond et al. Aug 2011 A1
20110219321 Gonzalez Veron et al. Sep 2011 A1
20110225525 Chasman et al. Sep 2011 A1
20110231273 Buchheit Sep 2011 A1
20110238716 Amir et al. Sep 2011 A1
20110258040 Gnanasambandam Oct 2011 A1
20110288900 McQueen et al. Nov 2011 A1
20110289397 Eastmond et al. Nov 2011 A1
20110289439 Jugel Nov 2011 A1
20110298618 Stahl et al. Dec 2011 A1
20110302003 Shirish et al. Dec 2011 A1
20120029962 Podgurny et al. Feb 2012 A1
20120035974 Seybold Feb 2012 A1
20120036423 Haynes et al. Feb 2012 A1
20120036462 Schwartz et al. Feb 2012 A1
20120050802 Masuda Mar 2012 A1
20120066587 Zhou et al. Mar 2012 A1
20120072821 Bowling Mar 2012 A1
20120079408 Rohwer Mar 2012 A1
20120081762 Yamada Apr 2012 A1
20120084798 Reeves et al. Apr 2012 A1
20120086716 Reeves et al. Apr 2012 A1
20120086717 Liu Apr 2012 A1
20120089610 Agrawal et al. Apr 2012 A1
20120089914 Holt et al. Apr 2012 A1
20120089992 Reeves et al. Apr 2012 A1
20120096389 Flam et al. Apr 2012 A1
20120096392 Ording et al. Apr 2012 A1
20120102432 Breedvelt-Schouten et al. Apr 2012 A1
20120102543 Kohli et al. Apr 2012 A1
20120110515 Abramoff et al. May 2012 A1
20120116834 Pope et al. May 2012 A1
20120116835 Pope et al. May 2012 A1
20120124749 Lewman May 2012 A1
20120130907 Thompson et al. May 2012 A1
20120131445 Oyarzabal et al. May 2012 A1
20120151173 Shirley et al. Jun 2012 A1
20120158744 Tseng et al. Jun 2012 A1
20120192050 Campbell et al. Jul 2012 A1
20120198322 Gulwani et al. Aug 2012 A1
20120210252 Fedoseyeva et al. Aug 2012 A1
20120215574 Driessnack et al. Aug 2012 A1
20120215578 Swierz, III et al. Aug 2012 A1
20120229867 Takagi Sep 2012 A1
20120233150 Naim et al. Sep 2012 A1
20120233533 Yücel et al. Sep 2012 A1
20120234907 Clark et al. Sep 2012 A1
20120236368 Uchida et al. Sep 2012 A1
20120239454 Taix et al. Sep 2012 A1
20120244891 Appleton Sep 2012 A1
20120246170 Iantorno Sep 2012 A1
20120254252 Jin et al. Oct 2012 A1
20120254770 Ophir Oct 2012 A1
20120260190 Berger et al. Oct 2012 A1
20120278117 Nguyen et al. Nov 2012 A1
20120284197 Strick et al. Nov 2012 A1
20120297307 Rider et al. Nov 2012 A1
20120300931 Ollikainen et al. Nov 2012 A1
20120303262 Alam et al. Nov 2012 A1
20120304098 Kuulusa Nov 2012 A1
20120311496 Cao et al. Dec 2012 A1
20120311672 Connor et al. Dec 2012 A1
20120324348 Rounthwaite Dec 2012 A1
20130015954 Thorne et al. Jan 2013 A1
20130018952 McConnell et al. Jan 2013 A1
20130018953 McConnell et al. Jan 2013 A1
20130018960 Knysz et al. Jan 2013 A1
20130024418 Strick et al. Jan 2013 A1
20130024760 Vogel et al. Jan 2013 A1
20130036369 Mitchell et al. Feb 2013 A1
20130041958 Post et al. Feb 2013 A1
20130054514 Barrett-Kahn et al. Feb 2013 A1
20130055113 Chazin et al. Feb 2013 A1
20130059598 Miyagi et al. Mar 2013 A1
20130063490 Zaman et al. Mar 2013 A1
20130086460 Folting et al. Apr 2013 A1
20130090969 Rivere Apr 2013 A1
20130097490 Kotler et al. Apr 2013 A1
20130103417 Seto et al. Apr 2013 A1
20130104035 Wagner et al. Apr 2013 A1
20130111320 Campbell et al. May 2013 A1
20130117268 Smith et al. May 2013 A1
20130159832 Ingargiola et al. Jun 2013 A1
20130159907 Brosche et al. Jun 2013 A1
20130179209 Milosevich Jul 2013 A1
20130211866 Gordon et al. Aug 2013 A1
20130212197 Karlson Aug 2013 A1
20130212234 Bartlett et al. Aug 2013 A1
20130215475 Noguchi Aug 2013 A1
20130238363 Ohta et al. Sep 2013 A1
20130238968 Barrus Sep 2013 A1
20130246384 Victor Sep 2013 A1
20130262527 Hunter Oct 2013 A1
20130268331 Bitz et al. Oct 2013 A1
20130297468 Hirsch et al. Nov 2013 A1
20130307997 O'Keefe et al. Nov 2013 A1
20130318424 Boyd Nov 2013 A1
20130339051 Dobrean Dec 2013 A1
20140002863 Hasegawa et al. Jan 2014 A1
20140006326 Bazanov Jan 2014 A1
20140012616 Moshenek Jan 2014 A1
20140019842 Montagna et al. Jan 2014 A1
20140033307 Schmidtler Jan 2014 A1
20140043331 Makinen et al. Feb 2014 A1
20140046638 Peloski Feb 2014 A1
20140052749 Rissanen Feb 2014 A1
20140058801 Deodhar et al. Feb 2014 A1
20140059017 Chaney et al. Feb 2014 A1
20140068403 Bhargav et al. Mar 2014 A1
20140074545 Minder et al. Mar 2014 A1
20140075301 Mihara Mar 2014 A1
20140078557 Hasegawa et al. Mar 2014 A1
20140082525 Kass et al. Mar 2014 A1
20140095237 Ehrler et al. Apr 2014 A1
20140101527 Suciu Apr 2014 A1
20140108985 Scott et al. Apr 2014 A1
20140109012 Choudhary et al. Apr 2014 A1
20140111516 Hall et al. Apr 2014 A1
20140115515 Adams et al. Apr 2014 A1
20140115518 Abdukalykov et al. Apr 2014 A1
20140129960 Wang et al. May 2014 A1
20140136972 Rodgers et al. May 2014 A1
20140137003 Peters et al. May 2014 A1
20140137144 Järvenpää et al. May 2014 A1
20140172475 Olliphant et al. Jun 2014 A1
20140173401 Oshlag et al. Jun 2014 A1
20140181155 Homsany Jun 2014 A1
20140188748 Cavoue et al. Jul 2014 A1
20140195933 Rao DV Jul 2014 A1
20140214404 Kalia et al. Jul 2014 A1
20140215303 Grigorovitch et al. Jul 2014 A1
20140229816 Yakub Aug 2014 A1
20140240735 Salgado Aug 2014 A1
20140249877 Hull et al. Sep 2014 A1
20140257568 Czaja et al. Sep 2014 A1
20140278638 Kreuzkamp et al. Sep 2014 A1
20140278720 Taguchi Sep 2014 A1
20140280287 Ganti et al. Sep 2014 A1
20140280377 Frew Sep 2014 A1
20140281868 Vogel et al. Sep 2014 A1
20140281869 Yob Sep 2014 A1
20140289223 Colwell et al. Sep 2014 A1
20140304174 Scott et al. Oct 2014 A1
20140306837 Hauck, III Oct 2014 A1
20140310345 Megiddo et al. Oct 2014 A1
20140324497 Verma et al. Oct 2014 A1
20140324501 Davidow et al. Oct 2014 A1
20140325552 Evans et al. Oct 2014 A1
20140365938 Black et al. Dec 2014 A1
20140372856 Radakovitz et al. Dec 2014 A1
20140372932 Rutherford et al. Dec 2014 A1
20150032686 Kuchoor Jan 2015 A1
20150033131 Peev et al. Jan 2015 A1
20150033149 Kuchoor Jan 2015 A1
20150035918 Matsumoto et al. Feb 2015 A1
20150039387 Akahoshi et al. Feb 2015 A1
20150046209 Choe Feb 2015 A1
20150067556 Tibrewal et al. Mar 2015 A1
20150074721 Fishman et al. Mar 2015 A1
20150074728 Chai et al. Mar 2015 A1
20150088822 Raja et al. Mar 2015 A1
20150095752 Studer et al. Apr 2015 A1
20150106736 Torman et al. Apr 2015 A1
20150125834 Mendoza May 2015 A1
20150142676 McGinnis et al. May 2015 A1
20150142829 Lee et al. May 2015 A1
20150153943 Wang Jun 2015 A1
20150154660 Weald et al. Jun 2015 A1
20150169514 Sah et al. Jun 2015 A1
20150169531 Campbell et al. Jun 2015 A1
20150178657 Kleehammer et al. Jun 2015 A1
20150188964 Sharma et al. Jul 2015 A1
20150205830 Bastide et al. Jul 2015 A1
20150212717 Nair et al. Jul 2015 A1
20150220491 Cochrane et al. Aug 2015 A1
20150234887 Greene et al. Aug 2015 A1
20150242091 Lu et al. Aug 2015 A1
20150249864 Tang et al. Sep 2015 A1
20150261796 Gould et al. Sep 2015 A1
20150262121 Riel-Dalpe et al. Sep 2015 A1
20150278699 Danielsson Oct 2015 A1
20150281292 Murayama et al. Oct 2015 A1
20150295877 Roman Oct 2015 A1
20150310126 Steiner et al. Oct 2015 A1
20150317590 Karlson Nov 2015 A1
20150324453 Werner Nov 2015 A1
20150331846 Guggilla et al. Nov 2015 A1
20150363478 Haynes Dec 2015 A1
20150370540 Coslovi et al. Dec 2015 A1
20150370776 New Dec 2015 A1
20150370904 Joshi et al. Dec 2015 A1
20150378542 Saito et al. Dec 2015 A1
20150378711 Cameron et al. Dec 2015 A1
20150378979 Hirzel et al. Dec 2015 A1
20150379472 Gilmour et al. Dec 2015 A1
20160012111 Pattabhiraman et al. Jan 2016 A1
20160018962 Low et al. Jan 2016 A1
20160026939 Schiffer et al. Jan 2016 A1
20160027076 Jackson et al. Jan 2016 A1
20160035546 Platt et al. Feb 2016 A1
20160041736 Schulz Feb 2016 A1
20160055134 Sathish et al. Feb 2016 A1
20160055374 Zhang et al. Feb 2016 A1
20160063435 Shah et al. Mar 2016 A1
20160068960 Jung et al. Mar 2016 A1
20160078368 Kakhandiki et al. Mar 2016 A1
20160088480 Chen et al. Mar 2016 A1
20160092557 Stojanovic et al. Mar 2016 A1
20160098574 Bargagni Apr 2016 A1
20160117308 Haider et al. Apr 2016 A1
20160170586 Gallo Jun 2016 A1
20160173122 Akitomi et al. Jun 2016 A1
20160196310 Dutta Jul 2016 A1
20160210572 Shaaban et al. Jul 2016 A1
20160224532 Miller et al. Aug 2016 A1
20160224676 Miller et al. Aug 2016 A1
20160224939 Chen et al. Aug 2016 A1
20160231915 Nhan et al. Aug 2016 A1
20160232489 Skaaksrud Aug 2016 A1
20160246490 Cabral Aug 2016 A1
20160253982 Cheung et al. Sep 2016 A1
20160259856 Ananthapur et al. Sep 2016 A1
20160275150 Bournonnais et al. Sep 2016 A1
20160292206 Ruiz Velazquez et al. Oct 2016 A1
20160299655 Migos et al. Oct 2016 A1
20160308963 Kung Oct 2016 A1
20160321235 He et al. Nov 2016 A1
20160321604 Imaeda et al. Nov 2016 A1
20160335302 Teodorescu et al. Nov 2016 A1
20160335303 Madhalam et al. Nov 2016 A1
20160335604 Reminick et al. Nov 2016 A1
20160335731 Hall Nov 2016 A1
20160335903 Mendoza Nov 2016 A1
20160344828 Häusler et al. Nov 2016 A1
20160350950 Ritchie et al. Dec 2016 A1
20160381099 Keslin et al. Dec 2016 A1
20170017779 Huang et al. Jan 2017 A1
20170031967 Chavan et al. Feb 2017 A1
20170041296 Ford et al. Feb 2017 A1
20170052937 Sirven et al. Feb 2017 A1
20170061342 Lore et al. Mar 2017 A1
20170061360 Rucker et al. Mar 2017 A1
20170061820 Firoozbakhsh Mar 2017 A1
20170063722 Cropper et al. Mar 2017 A1
20170075557 Noble et al. Mar 2017 A1
20170076101 Kochhar et al. Mar 2017 A1
20170090734 Fitzpatrick Mar 2017 A1
20170090736 King et al. Mar 2017 A1
20170091337 Patterson Mar 2017 A1
20170093876 Feng et al. Mar 2017 A1
20170109499 Doshi et al. Apr 2017 A1
20170111327 Wu Apr 2017 A1
20170116552 Deodhar et al. Apr 2017 A1
20170124042 Campbell et al. May 2017 A1
20170124048 Campbell et al. May 2017 A1
20170124055 Radakovitz et al. May 2017 A1
20170124740 Campbell et al. May 2017 A1
20170126772 Campbell et al. May 2017 A1
20170132296 Ding May 2017 A1
20170132652 Kedzlie et al. May 2017 A1
20170139874 Chin May 2017 A1
20170139884 Bendig et al. May 2017 A1
20170139891 Ah-Soon et al. May 2017 A1
20170139992 Morin May 2017 A1
20170140047 Bendig et al. May 2017 A1
20170140219 King et al. May 2017 A1
20170153771 Chu Jun 2017 A1
20170161246 Klima Jun 2017 A1
20170177556 Fay Jun 2017 A1
20170177888 Arora et al. Jun 2017 A1
20170185575 Sood et al. Jun 2017 A1
20170185668 Convertino et al. Jun 2017 A1
20170200122 Edson et al. Jul 2017 A1
20170206366 Fay Jul 2017 A1
20170212924 Semlani et al. Jul 2017 A1
20170220813 Mullins et al. Aug 2017 A1
20170221072 AthuluruTirumala et al. Aug 2017 A1
20170228421 Sharma et al. Aug 2017 A1
20170228445 Chiu et al. Aug 2017 A1
20170228460 Amel et al. Aug 2017 A1
20170229152 Loganathan et al. Aug 2017 A1
20170236081 Grady Smith et al. Aug 2017 A1
20170242921 Rota Aug 2017 A1
20170257517 Panda Sep 2017 A1
20170262786 Khasis Sep 2017 A1
20170270970 Ho et al. Sep 2017 A1
20170272316 Johnson et al. Sep 2017 A1
20170272331 Lissack Sep 2017 A1
20170277620 Kadioglu Sep 2017 A1
20170277669 Sekharan Sep 2017 A1
20170285879 Pilkington et al. Oct 2017 A1
20170285890 Dolman Oct 2017 A1
20170289619 Xu et al. Oct 2017 A1
20170301039 Dyer et al. Oct 2017 A1
20170315683 Boucher et al. Nov 2017 A1
20170315974 Kong et al. Nov 2017 A1
20170315979 Boucher et al. Nov 2017 A1
20170322963 Ramamurthi et al. Nov 2017 A1
20170324692 Zhou Nov 2017 A1
20170329479 Rauschenbach et al. Nov 2017 A1
20170351252 Kleifges et al. Dec 2017 A1
20170372442 Mejias Dec 2017 A1
20170374205 Panda Dec 2017 A1
20180011827 Avery et al. Jan 2018 A1
20180025084 Conlan et al. Jan 2018 A1
20180026954 Toepke et al. Jan 2018 A1
20180032492 Altshuller et al. Feb 2018 A1
20180032570 Miller et al. Feb 2018 A1
20180039651 Tobin et al. Feb 2018 A1
20180055434 Cheung et al. Mar 2018 A1
20180075104 Oberbreckling et al. Mar 2018 A1
20180075115 Murray et al. Mar 2018 A1
20180075413 Culver et al. Mar 2018 A1
20180075560 Thukral et al. Mar 2018 A1
20180081505 Ron et al. Mar 2018 A1
20180081863 Bathla Mar 2018 A1
20180081868 Willcock et al. Mar 2018 A1
20180088753 Viégas et al. Mar 2018 A1
20180088989 Nield et al. Mar 2018 A1
20180089299 Collins et al. Mar 2018 A1
20180095938 Monte Apr 2018 A1
20180096417 Cook et al. Apr 2018 A1
20180109760 Metter et al. Apr 2018 A1
20180121028 Kuscher May 2018 A1
20180121994 Matsunaga et al. May 2018 A1
20180128636 Zhou May 2018 A1
20180129651 Latvala et al. May 2018 A1
20180157455 Troy et al. Jun 2018 A1
20180157467 Stachura Jun 2018 A1
20180157468 Stachura Jun 2018 A1
20180157633 He et al. Jun 2018 A1
20180173715 Dunne Jun 2018 A1
20180181650 Komatsuda et al. Jun 2018 A1
20180181716 Mander et al. Jun 2018 A1
20180189734 Newhouse et al. Jul 2018 A1
20180210936 Reynolds et al. Jul 2018 A1
20180225270 Bhide et al. Aug 2018 A1
20180260371 Theodore et al. Sep 2018 A1
20180260435 Xu Sep 2018 A1
20180262705 Park et al. Sep 2018 A1
20180276417 Cerezo Sep 2018 A1
20180285918 Staggs Oct 2018 A1
20180293217 Callaghan Oct 2018 A1
20180293587 Oda Oct 2018 A1
20180293669 Jackson et al. Oct 2018 A1
20180329930 Eberlein et al. Nov 2018 A1
20180330320 Kohli Nov 2018 A1
20180357305 Kinast et al. Dec 2018 A1
20180365429 Segal Dec 2018 A1
20180367484 Rodriguez et al. Dec 2018 A1
20180373434 Switzer et al. Dec 2018 A1
20180373757 Schukovets et al. Dec 2018 A1
20190005094 Yi et al. Jan 2019 A1
20190011310 Turnbull et al. Jan 2019 A1
20190012342 Cohn Jan 2019 A1
20190034395 Curry et al. Jan 2019 A1
20190036989 Eirinberg et al. Jan 2019 A1
20190042628 Rajpara Feb 2019 A1
20190050445 Griffith et al. Feb 2019 A1
20190050466 Kim et al. Feb 2019 A1
20190050812 Boileau Feb 2019 A1
20190056856 Simmons et al. Feb 2019 A1
20190065545 Hazel et al. Feb 2019 A1
20190068703 Vora et al. Feb 2019 A1
20190073350 Shiotani Mar 2019 A1
20190095413 Davis et al. Mar 2019 A1
20190097909 Puri et al. Mar 2019 A1
20190108046 Spencer-Harper et al. Apr 2019 A1
20190113935 Kuo et al. Apr 2019 A1
20190114308 Hancock Apr 2019 A1
20190114589 Voss et al. Apr 2019 A1
20190123924 Embiricos et al. Apr 2019 A1
20190130611 Black et al. May 2019 A1
20190138583 Silk et al. May 2019 A1
20190138588 Silk et al. May 2019 A1
20190138653 Roller et al. May 2019 A1
20190147030 Stein et al. May 2019 A1
20190155821 Dirisala May 2019 A1
20190179501 Seeley et al. Jun 2019 A1
20190199823 Underwood et al. Jun 2019 A1
20190208058 Dvorkin et al. Jul 2019 A1
20190213557 Dotan-Cohen et al. Jul 2019 A1
20190220161 Loftus et al. Jul 2019 A1
20190236188 McKenna Aug 2019 A1
20190243879 Harley et al. Aug 2019 A1
20190251884 Burns et al. Aug 2019 A1
20190258461 Li et al. Aug 2019 A1
20190258706 Li et al. Aug 2019 A1
20190286839 Mutha et al. Sep 2019 A1
20190306009 Makovsky et al. Oct 2019 A1
20190324840 Malamut et al. Oct 2019 A1
20190325012 Delaney et al. Oct 2019 A1
20190327294 Subramani Nadar et al. Oct 2019 A1
20190340550 Denger et al. Nov 2019 A1
20190347077 Huebra Nov 2019 A1
20190361879 Rogynskyy et al. Nov 2019 A1
20190361971 Zenger et al. Nov 2019 A1
20190364009 Joseph et al. Nov 2019 A1
20190371442 Schoenberg Dec 2019 A1
20190377791 Abou Mahmoud et al. Dec 2019 A1
20190391707 Ristow et al. Dec 2019 A1
20200005248 Gerzi et al. Jan 2020 A1
20200005295 Murphy Jan 2020 A1
20200012629 Lereya et al. Jan 2020 A1
20200019548 Agnew et al. Jan 2020 A1
20200019595 Azua Jan 2020 A1
20200026352 Wang et al. Jan 2020 A1
20200026397 Wohlstadter et al. Jan 2020 A1
20200042648 Rao Feb 2020 A1
20200050696 Mowatt et al. Feb 2020 A1
20200053176 Jimenez et al. Feb 2020 A1
20200125574 Ghoshal et al. Apr 2020 A1
20200134002 Tung et al. Apr 2020 A1
20200142546 Breedvelt-Schouten et al. May 2020 A1
20200151630 Shakhnovich May 2020 A1
20200159558 Bak et al. May 2020 A1
20200175094 Palmer Jun 2020 A1
20200176089 Jones et al. Jun 2020 A1
20200192785 Chen Jun 2020 A1
20200193388 Tran-Kiem et al. Jun 2020 A1
20200247661 Rao et al. Aug 2020 A1
20200265112 Fox et al. Aug 2020 A1
20200279315 Manggala Sep 2020 A1
20200293616 Nelson et al. Sep 2020 A1
20200301678 Burman et al. Sep 2020 A1
20200301902 Maloy et al. Sep 2020 A1
20200310835 Momchilov Oct 2020 A1
20200326824 Alonso Oct 2020 A1
20200327244 Blass et al. Oct 2020 A1
20200334019 Bosworth et al. Oct 2020 A1
20200348809 Drescher Nov 2020 A1
20200349320 Owens Nov 2020 A1
20200356740 Principato Nov 2020 A1
20200356873 Nawrocke et al. Nov 2020 A1
20200374146 Chhabra et al. Nov 2020 A1
20200380212 Butler et al. Dec 2020 A1
20200380449 Choi Dec 2020 A1
20200387664 Kusumura et al. Dec 2020 A1
20200401581 Eubank et al. Dec 2020 A1
20200409949 Saxena et al. Dec 2020 A1
20200410395 Ray et al. Dec 2020 A1
20210014136 Rath Jan 2021 A1
20210019287 Prasad et al. Jan 2021 A1
20210021603 Gibbons Jan 2021 A1
20210034058 Subramanian et al. Feb 2021 A1
20210035069 Parikh Feb 2021 A1
20210042796 Khoury et al. Feb 2021 A1
20210049524 Nachum et al. Feb 2021 A1
20210049555 Shor Feb 2021 A1
20210055955 Yankelevich et al. Feb 2021 A1
20210056509 Lindy Feb 2021 A1
20210065203 Billigmeier et al. Mar 2021 A1
20210072883 Migunova et al. Mar 2021 A1
20210073526 Zeng et al. Mar 2021 A1
20210084120 Fisher et al. Mar 2021 A1
20210124749 Suzuki et al. Apr 2021 A1
20210124872 Lereya Apr 2021 A1
20210136027 Barbitta et al. May 2021 A1
20210149553 Lereya et al. May 2021 A1
20210149688 Newell et al. May 2021 A1
20210149925 Mann et al. May 2021 A1
20210150489 Haramati et al. May 2021 A1
20210165782 Deshpande et al. Jun 2021 A1
20210166196 Lereya et al. Jun 2021 A1
20210166339 Mann et al. Jun 2021 A1
20210173682 Chakraborti et al. Jun 2021 A1
20210174006 Stokes Jun 2021 A1
20210192126 Gehrmann et al. Jun 2021 A1
20210248311 Helft et al. Aug 2021 A1
20210257065 Mander et al. Aug 2021 A1
20210264220 Wei et al. Aug 2021 A1
20210326519 Lin et al. Oct 2021 A1
20210328888 Rath Oct 2021 A1
20210342785 Mann et al. Nov 2021 A1
20210365446 Srivastava et al. Nov 2021 A1
20210397585 Seward Dec 2021 A1
20220099454 Decrop Mar 2022 A1
20220121325 Roberts Apr 2022 A1
20220121478 Chivukula et al. Apr 2022 A1
20220221591 Smith et al. Jul 2022 A1
20220291666 Cella et al. Sep 2022 A1
20230153651 Bi et al. May 2023 A1
20230316382 Faricy et al. Oct 2023 A1
20230419161 Dines Dec 2023 A1
20240046142 Marks et al. Feb 2024 A1
20240053727 Timisescu et al. Feb 2024 A1
Foreign Referenced Citations (25)
Number Date Country
2828011 Sep 2012 CA
103064833 Apr 2013 CN
107123424 Sep 2017 CN
107422666 Dec 2017 CN
107623596 Jan 2018 CN
107885656 Apr 2018 CN
108717428 Oct 2018 CN
112929172 Jun 2021 CN
3443466 Dec 2021 EP
20150100760 Sep 2015 KR
20220016276 Feb 2022 KR
WO 2004100015 Nov 2004 WO
WO 2006116580 Nov 2006 WO
WO 2008109541 Sep 2008 WO
2014088393 Jun 2014 WO
WO 2017202159 Nov 2017 WO
2018023798 Feb 2018 WO
2018042424 Mar 2018 WO
2020139865 Jul 2020 WO
WO 2020187408 Sep 2020 WO
WO 2021096944 May 2021 WO
WO 2021144656 Jul 2021 WO
WO 2021161104 Aug 2021 WO
WO 2021220058 Nov 2021 WO
2022153122 Jul 2022 WO
Non-Patent Literature Citations (42)
Entry
D'Alessio et al., "Monday.com Walkthrough 2018 | All Features, Platforms & Thoughts", Mar. 1, 2018, pp. 1-55 (Year: 2018).
Rodrigo et al., "Project Management with Monday.com: a 101 Introduction", Jul. 22, 2019, pp. 1-21 (Year: 2019).
International Search Report and Written Opinion of the International Searching Authority in PCT/IB2020/000658, mailed Nov. 11, 2020 (12 pages).
International Search Report in PCT/IB2020/000974, mailed May 3, 2021 (19 pages).
International Search Report in PCT/IB2021/000090, mailed Jul. 27, 2021.
ShowMyPC, "Switch Presenter While Using ShowMyPC"; web.archive.org; Aug. 20, 2016.
International Search Report and Written Opinion of the International Search Authority in PCT/IB2020/000024, mailed May 3, 2021 (13 pages).
"Pivot table—Wikipedia"; URL: https://en.wikipedia.org/w/index.php?title=Pivot_table&oldid=857163289, originally retrieved on Oct. 23, 2019; retrieved on Jul. 16, 2021.
Vishal Singh, “A Theoretical Framework of a BIM-based Multi-Disciplinary Collaboration Platform”, Nov. 5, 2020, Automation in Construction, 20 (2011), pp. 134-144 (Year: 2011).
Edward A. Stohr, Workflow Automation: Overview and Research Issues, 2001, Information Systems Frontiers 3:3, pp. 281-296 (Year: 2001).
International Search Report and Written Opinion of the International Search Authority in PCT/IB2021/000297, mailed Oct. 12, 2021 (20 pages).
Dapulse.com "features", extracted from web.archive.org/web/2014091818421/https://dapulse.com/features; Sep. 2014 (Year: 2014).
Stephen Larson et al., Introducing Data Mining Concepts Using Microsoft Excel's Table Analysis Tools, Oct. 2015, [Retrieved on Nov. 19, 2021], Retrieved from the internet: <URL: https://dl.acm.org/doi/pdf/10.5555/2831373.2831394> 3 Pages (127-129) (Year: 2015).
Isaiah Pinchas et al., Lexical Analysis Tool, May 2004, [Retrieved on Nov. 19, 2021], Retrieved from the internet: <URL: https://dl.acm.org/doi/pdf/10.1145/997140.997147> 9 Pages (66-74) (Year: 2004).
Sajjad Bahrebar et al., “A Novel Type-2 Fuzzy Logic for Improved Risk Analysis of Proton Exchange Membrane Fuel Cells in Marine Power Systems Application”, Energies, 11, 721, pp. 1-16, Mar. 22, 2018.
Pedersen et al., "Tivoli: an electronic whiteboard for informal workgroup meetings", Conference on Human Factors in Computing Systems: Proceedings of the Interact '93 and CHI '93 conference on Human factors in computing systems; Apr. 24-29, 1993:391-398. (Year: 1993).
Kollmann, Franz, “Realizing Fine-Granular Read and Write Rights on Tree Structured Documents.” in The Second International Conference on Availability, Reliability and Security (ARES'07), pp. 517-523. IEEE, 2007. (Year: 2007).
Baarslag, “Negotiation as an Interaction Mechanism for Deciding App Permissions.” In Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems, pp. 2012-2019. 2016 (Year: 2016).
Peltier, "Clustered and Stacked Column and Bar Charts", Aug. 2011, Peltier Technical Services, Inc., pp. 1-128; (Year: 2011).
Beate List, “An Evaluation of Conceptual Business Process Modelling Languages”, 2006, SAC'06, Apr. 23-27, pp. 1532-1539 (Year: 2006).
“Demonstracion en espanol de Monday.com”, published Feb. 20, 2019. https://www.youtube.com/watch?v=z0qydTgof1A (Year: 2019).
Desmedt, Yvo, and Arash Shaghaghi, “Function-Based Access Control (FBAC) From Access Control Matrix to Access Control Tensor.” In Proceedings of the 8th ACM CCS International Workshop on Managing Insider Security Threats, pp. 89-92. (2016).
Anupam, V., et al., “Personalizing the Web Using Site Descriptions”, Proceedings of the Tenth International Workshop on Database and Expert Systems Applications, ISBN: 0-7695-0281-4, DOI: 10.1109/DEXA.1999.795275, Jan. 1, 1999, pp. 732-738. (Year: 1999).
Gutwin, C. et al., “Supporting Informal Collaboration in Shared-Workspace Groupware”, J. Univers. Comput. Sci., 14(9), 1411-1434 (2008).
Barai, S., et al., “Image Annotation System Using Visual and Textual Features”, In: Proceedings of the 16th International Conference on Distributed Multi-media Systems, pp. 289-296 (2010).
B. Ionescu, C. Gadea, B. Solomon, M. Trifan, D. Ionescu and V. Stoicu-Tivadar, "A chat-centric collaborative environment for web-based real-time collaboration," 2015 IEEE 10th Jubilee International Symposium on Applied Computational Intelligence and Informatics, Timisoara, Romania, 2015, pp. 105-110 (Year: 2015).
Susanne Hupfer, Li-Te Cheng, Steven Ross, and John Patterson. 2004. Introducing collaboration into an application development environment. In Proceedings of the 2004 ACM conference on Computer supported cooperative work (CSCW '04). Association for Computing Machinery, New York, NY, USA, 21-24 (Year: 2004).
Abor Jr, C., "Low-Code and No-Code AI: New AI Development—What is code anymore?!?!" (as retrieved from https://www.linkedin.com/pulse/low-code-no-code-ai-new-development-what-code-anymore-c-l-abor-jr); Jul. 15, 2023 (Year: 2023).
Aylward, Grant, "Drag-and-Drop AI Enables Digital Workforce Deployment at Scale Share" (as retrieved from https://www.blueprism.com/resources/blog/drag-and-drop-ai-enables-digital-workforce-deployment-at-scale/); Mar. 19, 2020 (Year: 2020).
Chen et al., “Artificial Intelligence in Education: A Review,” IEEEAccess vol. 8, pp. 75264-75278 (Year: 2020).
Dapulse.com, “High Level Overview”, Extracted from https://web.archive.org/web/20161104170936/https://dapulse.com (Year: 2016).
Donath, “Interfaces Make Meaning” chapter from The Social Machine: Designs for Living Online, pp. 41-76, copyright 2014. (Year: 2014).
Dorn et al., “Efficient Full-Field Vibration Measurements and Operational Modal Analysis Using Neuromorphic Event-Based Imaging,” Journal of Engineering Mechanics, vol. 144, No. 7, Jul. 1, 2018 (Year: 2018).
Freund, K., “SiMa.ai Creates Drag-And-Drop Platform For Building AI Workflows” (as retrieved from https://www.forbes.com/sites/karlfreund/2023/09/12/simaal-creates-drag-and-drop-platform-for-building-ai-workflows/?sh=789de8466046); Sep. 12, 2023 (Year: 2023).
Monday.com et al., “Basic Walkthrough”, https://www.youtube.com/watch?v=VpbgWyPf74g; Aug. 9, 2019. (Year: 2019).
Sreenath et al., “Agent-based service selection,” Journal of Web Semantics 1.3, pp. 261-279 (Year: 2004).
Stancu et al., "SecCollab-Improving Confidentiality for Existing Cloud-Based Collaborative Editors." In 2017 21st International Conference on Control Systems and Computer Science (CSCS), pp. 324-331. IEEE, 2017. (Year: 2017).
“Using Filters in Overview,” published Mar. 7, 2017. https://www.youtube.com/watch?v=hycANhz7gww (Year: 2017).
Wilson et al., “Beyond Social Graphs: User Interactions in Online Social Networks and their Implications,” ACM Transactions on the Web, vol. 6, No. 4, Article 17, Nov. 2012 (Year: 2012).
Zhang et al., “Integrating semantic NLP and logic reasoning into a unified system for fully-automated code checking,” Automation in Construction, vol. 73, 2017, pp. 45-57, ISSN 0926-5805, https://doi.org/10.1016/j.autcon.2016.08.027.
Zhenjiang et al., “Asynchronous Event-Based Visual Shape Tracking for Stable Haptic Feedback in Microrobotics,” IEEE Transactions on Robotics, IEEE Service Center, Piscataway, NJ, vol. 28, No. 5, Oct. 1, 2012, pp. 1081-1089 (Year: 2012).
Ziheng, G., “Advanced Cyberinfrastructure for Managing Hybrid Geoscientific AI Workflows” (Year: 2019).
Related Publications (1)
Number Date Country
20230333728 A1 Oct 2023 US
Provisional Applications (1)
Number Date Country
63273453 Oct 2021 US