Digital processing systems and methods for navigating and viewing displayed content

Information

  • Patent Grant
  • Patent Number
    11,741,071
  • Date Filed
    Wednesday, December 28, 2022
  • Date Issued
    Tuesday, August 29, 2023
Abstract
Systems and methods are provided for performing table navigation operations on a display, wherein the table comprises rows and columns and at least one cell that contains more information than presented on the display. Disclosed embodiments include receiving a scroll signal for scrolling the table, wherein the scroll signal results from a motion of a user on the display; in response to a vertical component in the motion, causing the table to scroll vertically; in response to a horizontal component in the motion, causing the table to scroll horizontally; receiving an enhancing scroll signal resulting from an enhancing motion for revealing hidden information within the at least one cell on the table; and in response to the enhancing scroll signal, revealing at least a portion of the hidden information within the at least one cell.
Description
TECHNICAL FIELD

The present disclosure relates generally to information display methods and content navigation. More specifically, this disclosure relates to systems and methods for navigating and viewing displayed content, such as by performing table navigation and manipulation operations on a computer display. Consistent with the disclosed embodiments, non-transitory computer-readable storage media may store program instructions, which may be executable by at least one processing device to perform any of the steps and/or methods described herein.


BACKGROUND

Operation of modern enterprises can be complicated and time-consuming. In many cases, managing the operation of a single project requires integration of several employees, departments, and other resources of the entity. To manage the challenging operation, project management software applications may be used. Such software applications allow a user to organize, plan, and manage resources by providing project-related information in order to optimize the time and resources spent on each project.


In addition, project management software applications often rely on the use of data tables to illustrate the plurality of tasks required by a project, the distribution of tasks, or the progress of tasks. Most of the time, by convention, the width and height of all cells in a table displayed on any display device are the same. If a cell contains a particularly large piece of information, the entire piece of information may not be displayed, which impacts the readability and practicality of the table. This is notably true for mobile devices having particularly limited display dimensions.


Changing the dimensions of such a cell usually requires specific operations that take time, which has an impact on the efficiency of the table user. Therefore, it is essential to be able to quickly display all the information included in the cells on a table, while a user navigates through displayed content. The present disclosure describes solutions to address or overcome one or more of the above-stated problems, among other drawbacks in traditional systems.


SUMMARY

Embodiments consistent with the present disclosure provide systems and methods for performing table navigation operations on a display. The disclosed embodiments may be implemented using a combination of conventional hardware and software as well as specialized hardware and software.


In an embodiment, a non-transitory computer-readable medium containing instructions that, when executed, cause at least one processor to perform table navigation operations on a display, wherein the table comprises rows and columns and at least one cell that contains more information than presented on the display is disclosed. The operations may comprise receiving a scroll signal for scrolling the table, wherein the scroll signal results from a motion of a user on the display; in response to a vertical component in the motion, causing the table to scroll vertically; in response to a horizontal component in the motion, causing the table to scroll horizontally; receiving an enhancing scroll signal resulting from an enhancing motion for revealing hidden information within the at least one cell on the table; and in response to the enhancing scroll signal, revealing at least a portion of the hidden information within the at least one cell.


In another embodiment, a method for performing table navigation operations on a display, wherein the table comprises rows and columns and at least one cell that contains more information than it presents is disclosed. The method may comprise: receiving a scroll signal for scrolling the table, wherein the scroll signal results from a motion of a user on the display; in response to a vertical component in the motion, causing the table to scroll vertically; in response to a horizontal component in the motion, causing the table to scroll horizontally; receiving an enhancing scroll signal resulting from an enhancing motion for revealing hidden information within the at least one cell on the table; and in response to the enhancing scroll signal, revealing at least a portion of the hidden information within the at least one cell.


In yet another embodiment, a system for performing table navigation operations on a display, wherein the table comprises rows and columns and at least one cell that contains more information than it presents is disclosed. The system may comprise a memory storing instructions and at least one processor that executes the stored instructions to receive a scroll signal for scrolling the table, wherein the scroll signal results from a motion of a user on the display; in response to a vertical component in the motion, cause the table to scroll vertically; in response to a horizontal component in the motion, cause the table to scroll horizontally; receive an enhancing scroll signal resulting from an enhancing motion for revealing hidden information within the at least one cell on the table; and in response to the enhancing scroll signal, reveal at least a portion of the hidden information within the at least one cell.


Other advantages of the invention are set forth in the appended claims which form an integral part hereof. The foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the claims.





BRIEF DESCRIPTION OF THE DRAWINGS

The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate various disclosed embodiments. In the drawings:



FIG. 1 is a flowchart of an exemplary process for performing table navigation operations, consistent with disclosed embodiments.



FIG. 2 is an illustration of an exemplary table presented on a display comprising rows and columns and at least one cell that contains more information than presented on the display, consistent with disclosed embodiments.



FIGS. 3A-3B are illustrations of exemplary tables presented on a display comprising rows and columns after reception of an enhancing scroll signal, in which the at least one cell that initially contained more information than presented on the display is in an enhanced form, consistent with disclosed embodiments.



FIG. 4 is a block diagram of an exemplary computing device which may be employed in connection with embodiments of the present disclosure.



FIG. 5 is a block diagram of an exemplary computing architecture for collaborative work systems, consistent with embodiments of the present disclosure.





DETAILED DESCRIPTION

Disclosed embodiments provide new and improved techniques for navigating and viewing displayed content, including techniques for dynamically modifying display of table content to display all contents of a table cell, in response to detection of an enhancing scroll signal.


Exemplary embodiments are described with reference to the accompanying drawings. The figures are not necessarily drawn to scale. While examples and features of disclosed principles are described herein, modifications, adaptations, and other implementations are possible without departing from the spirit and scope of the disclosed embodiments. Also, the words “comprising,” “having,” “containing,” and “including,” and other similar forms are intended to be equivalent in meaning and be open ended in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items. It should also be noted that as used herein and in the appended claims, the singular forms “a,” “an,” and “the” include plural references unless the context clearly dictates otherwise.


In the following description, various working examples are provided for illustrative purposes. However, it is to be understood that the present disclosure may be practiced without one or more of these details.


Throughout, this disclosure mentions “disclosed embodiments,” which refer to examples of inventive ideas, concepts, and/or manifestations described herein. Many related and unrelated embodiments are described throughout this disclosure. The fact that some “disclosed embodiments” are described as exhibiting a feature or characteristic does not mean that other disclosed embodiments necessarily share that feature or characteristic.


This disclosure presents various mechanisms for collaborative work systems. Such systems may involve software that enables multiple users to work collaboratively. By way of one example, workflow management software may enable various members of a team to cooperate via a common online platform. It is intended that one or more aspects of any mechanism may be combined with one or more aspects of any other mechanism, and such combinations are within the scope of this disclosure.


This disclosure is constructed to provide a basic understanding of a few exemplary embodiments with the understanding that features of the exemplary embodiments may be combined with other disclosed features or may be incorporated into platforms or embodiments not described herein while still remaining within the scope of this disclosure. For convenience, any form of the word “embodiment” as used herein is intended to refer to a single embodiment or multiple embodiments of the disclosure.


Certain embodiments disclosed herein include devices, systems, and methods for collaborative work systems that may allow a user to interact with information in real time. To avoid repetition, the functionality of some embodiments is described herein solely in connection with a processor or at least one processor. It is to be understood that such exemplary descriptions of functionality apply equally to methods and computer readable media and constitute a written description of systems, methods, and computer readable media. The underlying platform may allow a user to structure systems, methods, or computer readable media in many ways using common building blocks, thereby permitting flexibility in constructing a product that suits desired needs. This may be accomplished through the use of boards. A board may be a table configured to contain items (e.g., individual items presented in horizontal rows) defining objects or entities that are managed in the platform (task, project, client, deal, etc.). Unless expressly noted otherwise, the terms “board” and “table” may be considered synonymous for purposes of this disclosure. In some embodiments, a board may contain information beyond that which is displayed in a table. Boards may include sub-boards that may have a separate structure from a board. Sub-boards may be tables with sub-items that may be related to the items of a board. Columns intersecting with rows of items may together define cells in which data associated with each item may be maintained. Each column may have a heading or label defining an associated data type. When used herein in combination with a column, a row may be presented horizontally and a column vertically. However, in the broader generic sense as used herein, the term “row” may refer to one or more of a horizontal and/or a vertical presentation. A table or tablature, as used herein, refers to data presented in horizontal and vertical rows (e.g., horizontal rows and vertical columns) defining cells in which data is presented. Tablature may refer to any structure for presenting data in an organized manner, as previously discussed, such as cells presented in horizontal rows and vertical columns, vertical rows and horizontal columns, a tree data structure, a web chart, or any other structured representation, as explained throughout this disclosure. A cell may refer to a unit of information contained in the tablature defined by the structure of the tablature. For example, a cell may be defined as an intersection between a horizontal row and a vertical column in a tablature having rows and columns. A cell may also be defined as an intersection between a horizontal and a vertical row, or as an intersection between a horizontal and a vertical column. As a further example, a cell may be defined as a node on a web chart or a node on a tree data structure. As would be appreciated by a skilled artisan, however, the disclosed embodiments are not limited to any specific structure, but rather may be practiced in conjunction with any desired organizational arrangement. In addition, tablature may include any type of information, depending on intended use. When used in conjunction with a workflow management application, the tablature may include any information associated with one or more tasks, such as one or more status values, projects, countries, persons, teams, progress statuses, a combination thereof, or any other information related to a task.
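
By way of a non-limiting illustration only, the board and cell concepts described above might be modeled roughly as in the following TypeScript sketch. The shapes and names used here (Board, Column, Item, Cell, cellAt) are hypothetical choices made for this example and are not part of any disclosed implementation.

```typescript
// Minimal sketch of a board ("table") data structure, assuming each column
// declares a data type and each cell sits at a row/column intersection.
// All names here are illustrative, not a platform API.

type ColumnType = "text" | "status" | "person" | "date" | "number";

interface Column {
  id: string;
  label: string;       // column heading, e.g. "Owner"
  type: ColumnType;    // data type shared by every cell in the column
}

interface Cell {
  columnId: string;
  value: unknown;      // data maintained at the row/column intersection
}

interface Item {
  id: string;
  name: string;        // e.g. a task, project, client, or deal
  cells: Cell[];       // one cell per column
}

interface Board {
  id: string;
  name: string;
  columns: Column[];
  items: Item[];       // items are typically presented as horizontal rows
  subBoards?: Board[]; // sub-boards may hold sub-items related to the items
}

// Resolve the cell at the intersection of an item (row) and a column.
function cellAt(board: Board, itemId: string, columnId: string): Cell | undefined {
  const item = board.items.find((i) => i.id === itemId);
  return item?.cells.find((c) => c.columnId === columnId);
}
```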


While a table view may be one way to present and manage the data contained on a board, a table's or board's data may be presented in different ways. For example, in some embodiments, dashboards may be utilized to present or summarize data derived from one or more boards. A dashboard may be a non-table form of presenting data, using, for example, static or dynamic graphical representations. A dashboard may also include multiple non-table forms of presenting data. As discussed later in greater detail, such representations may include various forms of graphs or graphics. In some instances, dashboards (which may also be referred to more generically as “widgets”) may include tablature. Software links may interconnect one or more boards with one or more dashboards thereby enabling the dashboards to reflect data presented on the boards. This may allow, for example, data from multiple boards to be displayed and/or managed from a common location. These widgets may provide visualizations that allow a user to update data derived from one or more boards.


Boards (or the data associated with boards) may be stored in a local memory on a user device or may be stored in a local network repository. Boards may also be stored in a remote repository and may be accessed through a network. In some instances, permissions may be set to limit board access to the board's “owner” while in other embodiments a user's board may be accessed by other users through any of the networks described in this disclosure. When one user makes a change in a board, that change may be updated to the board stored in a memory or repository and may be pushed to the other user devices that access that same board. These changes may be made to cells, items, columns, boards, dashboard views, logical rules, or any other data associated with the boards. Similarly, when cells are tied together or are mirrored across multiple boards, a change in one board may cause a cascading change in the tied or mirrored boards or dashboards of the same or other owners.


Boards and widgets may be part of a platform that may enable users to interact with information in real time in collaborative work systems involving electronic collaborative word processing documents. Electronic collaborative word processing documents (and other variations of the term) as used herein are not limited to only digital files for word processing, but may include any other processing document such as presentation slides, tables, databases, graphics, sound files, video files or any other digital document or file. Electronic collaborative word processing documents may include any digital file that may provide for input, editing, formatting, display, and/or output of text, graphics, widgets, objects, tables, links, animations, dynamically updated elements, or any other data object that may be used in conjunction with the digital file. Any information stored on or displayed from an electronic collaborative word processing document may be organized into blocks. A block may include any organizational unit of information in a digital file, such as a single text character, word, sentence, paragraph, page, graphic, or any combination thereof. Blocks may include static or dynamic information, and may be linked to other sources of data for dynamic updates. Blocks may be automatically organized by the system, or may be manually selected by a user according to preference. In one embodiment, a user may select a segment of any information in an electronic word processing document and assign it as a particular block for input, editing, formatting, or any other further configuration.


An electronic collaborative word processing document may be stored in one or more repositories connected to a network accessible by one or more users through their computing devices. In one embodiment, one or more users may simultaneously edit an electronic collaborative word processing document. The one or more users may access the electronic collaborative word processing document through one or more user devices connected to a network. User access to an electronic collaborative word processing document may be managed through permission settings set by an author of the electronic collaborative word processing document. An electronic collaborative word processing document may include graphical user interface elements enabled to support the input, display, and management of multiple edits made by multiple users operating simultaneously within the same document.


Various embodiments are described herein with reference to a system, method, device, or computer readable medium. It is intended that the disclosure of one is a disclosure of all. For example, it is to be understood that disclosure of a computer readable medium described herein also constitutes a disclosure of methods implemented by the computer readable medium, and systems and devices for implementing those methods, via for example, at least one processor. It is to be understood that this form of disclosure is for ease of discussion only, and one or more aspects of one embodiment herein may be combined with one or more aspects of other embodiments herein, within the intended scope of this disclosure.


Embodiments described herein may refer to a non-transitory computer readable medium containing instructions that when executed by at least one processor, cause the at least one processor to perform a method. Non-transitory computer readable mediums may be any medium capable of storing data in any memory in a way that may be read by any computing device with a processor to carry out methods or any other instructions stored in the memory. The non-transitory computer readable medium may be implemented as hardware, firmware, software, or any combination thereof. Moreover, the software may preferably be implemented as an application program tangibly embodied on a program storage unit or computer readable medium consisting of parts, or of certain devices and/or a combination of devices. The application program may be uploaded to, and executed by, a machine comprising any suitable architecture. Preferably, the machine may be implemented on a computer platform having hardware such as one or more central processing units (“CPUs”), a memory, and input/output interfaces. The computer platform may also include an operating system and microinstruction code. The various processes and functions described in this disclosure may be either part of the microinstruction code or part of the application program, or any combination thereof, which may be executed by a CPU, whether or not such a computer or processor is explicitly shown. In addition, various other peripheral units may be connected to the computer platform such as an additional data storage unit and a printing unit. Furthermore, a non-transitory computer readable medium may be any computer readable medium except for a transitory propagating signal.


As used herein, a non-transitory computer-readable storage medium refers to any type of physical memory on which information or data readable by at least one processor can be stored. Examples of memory include Random Access Memory (RAM), Read-Only Memory (ROM), volatile memory, nonvolatile memory, hard drives, CD ROMs, DVDs, flash drives, disks, any other optical data storage medium, any physical medium with patterns of holes, markers, or other readable elements, a PROM, an EPROM, a FLASH-EPROM or any other flash memory, NVRAM, a cache, a register, any other memory chip or cartridge, and networked versions of the same. The terms “memory” and “computer-readable storage medium” may refer to multiple structures, such as a plurality of memories or computer-readable storage mediums located within an input unit or at a remote location. Additionally, one or more computer-readable storage mediums can be utilized in implementing a computer-implemented method. The memory may include one or more separate storage devices collocated or dispersed, capable of storing data structures, instructions, or any other data. The memory may further include a memory portion containing instructions for the processor to execute. The memory may also be used as a working scratch pad for the processors or as temporary storage. Accordingly, the term computer-readable storage medium should be understood to include tangible items and exclude carrier waves and transient signals.


Some embodiments may involve at least one processor. Consistent with disclosed embodiments, “at least one processor” may constitute any physical device or group of devices having electric circuitry that performs a logic operation on an input or inputs. For example, the at least one processor may include one or more integrated circuits (IC), including application-specific integrated circuit (ASIC), microchips, microcontrollers, microprocessors, all or part of a central processing unit (CPU), graphics processing unit (GPU), digital signal processor (DSP), field-programmable gate array (FPGA), server, virtual server, or other circuits suitable for executing instructions or performing logic operations. The instructions executed by at least one processor may, for example, be pre-loaded into a memory integrated with or embedded into the controller or may be stored in a separate memory. The memory may include a Random Access Memory (RAM), a Read-Only Memory (ROM), a hard disk, an optical disk, a magnetic medium, a flash memory, other permanent, fixed, or volatile memory, or any other mechanism capable of storing instructions. In some embodiments, the at least one processor may include more than one processor. Each processor may have a similar construction or the processors may be of differing constructions that are electrically connected or disconnected from each other. For example, the processors may be separate circuits or integrated in a single circuit. When more than one processor is used, the processors may be configured to operate independently or collaboratively, and may be co-located or located remotely from each other. The processors may be coupled electrically, magnetically, optically, acoustically, mechanically or by other means that permit them to interact.


Consistent with the present disclosure, disclosed embodiments may involve a network. A network may constitute any type of physical or wireless computer networking arrangement used to exchange data. For example, a network may be the Internet, a private data network, a virtual private network using a public network, a Wi-Fi network, a LAN or WAN network, a combination of one or more of the foregoing, and/or other suitable connections that may enable information exchange among various components of the system. In some embodiments, a network may include one or more physical links used to exchange data, such as Ethernet, coaxial cables, twisted pair cables, fiber optics, or any other suitable physical medium for exchanging data. A network may also include a public switched telephone network (“PSTN”) and/or a wireless cellular network. A network may be a secured network or unsecured network. In other embodiments, one or more components of the system may communicate directly through a dedicated communication network. Direct communications may use any suitable technologies, including, for example, BLUETOOTH™, BLUETOOTH LE™ (BLE), Wi-Fi, near field communications (NFC), or other suitable communication methods that provide a medium for exchanging data and/or information between separate entities.


Certain embodiments disclosed herein may also include a computing device for generating features for work collaborative systems. The computing device may include processing circuitry communicatively connected to a network interface and to a memory, wherein the memory contains instructions that, when executed by the processing circuitry, configure the computing device to receive, from a user device associated with a user account, an instruction to generate a new column of a single data type for a first data structure, wherein the first data structure may be a column-oriented data structure, and store, based on the instructions, the new column within the column-oriented data structure repository, wherein the column-oriented data structure repository may be accessible and may be displayed as a display feature to the user and at least a second user account. The computing devices may be devices such as mobile devices, desktops, laptops, tablets, or any other devices capable of processing data. Such computing devices may include a display, such as an LED display, an augmented reality (AR) display, or a virtual reality (VR) display.


Disclosed embodiments may include and/or access a data structure. A data structure consistent with the present disclosure may include any collection of data values and relationships among them. The data may be stored linearly, horizontally, hierarchically, relationally, non-relationally, uni-dimensionally, multidimensionally, operationally, in an ordered manner, in an unordered manner, in an object-oriented manner, in a centralized manner, in a decentralized manner, in a distributed manner, in a custom manner, or in any manner enabling data access. By way of non-limiting examples, data structures may include an array, an associative array, a linked list, a binary tree, a balanced tree, a heap, a stack, a queue, a set, a hash table, a record, a tagged union, ER model, and a graph. For example, a data structure may include an XML database, an RDBMS database, an SQL database or NoSQL alternatives for data storage/search such as, for example, MongoDB, Redis, Couchbase, Datastax Enterprise Graph, Elastic Search, Splunk, Solr, Cassandra, Amazon DynamoDB, Scylla, HBase, and Neo4J. A data structure may be a component of the disclosed system or a remote computing component (e.g., a cloud-based data structure). Data in the data structure may be stored in contiguous or non-contiguous memory. Moreover, a data structure, as used herein, does not require information to be co-located. It may be distributed across multiple servers, for example, that may be owned or operated by the same or different entities. Thus, the term “data structure” as used herein in the singular is inclusive of plural data structures.


Certain embodiments disclosed herein may include a processor configured to perform methods that may include triggering an action in response to an input. The input may be from a user action or from a change of information contained in a user's table or board, in another table, across multiple tables, across multiple user devices, or from third-party applications. Triggering may be caused manually, such as through a user action, or may be caused automatically, such as through a logical rule, logical combination rule, or logical templates associated with a board. For example, a trigger may include an input of a data item that is recognized by at least one processor that brings about another action.


In some embodiments, the methods including triggering may cause an alteration of data and may also cause an alteration of display of data contained in a board or in memory. An alteration of data may include a recalculation of data, the addition of data, the subtraction of data, or a rearrangement of information. Further, triggering may also cause a communication to be sent to a user, other individuals, or groups of individuals. The communication may be a notification within the system or may be a notification outside of the system through a contact address such as by email, phone call, text message, video conferencing, or any other third-party communication application.


Some embodiments include one or more of automations, logical rules, logical sentence structures and logical (sentence structure) templates. While these terms are described herein in differing contexts, in a broadest sense, in each instance an automation may include a process that responds to a trigger or condition to produce an outcome; a logical rule may underlie the automation in order to implement the automation via a set of instructions; a logical sentence structure is one way for a user to define an automation; and a logical template/logical sentence structure template may be a fill-in-the-blank tool used to construct a logical sentence structure. While all automations may have an underlying logical rule, all automations need not implement that rule through a logical sentence structure. Any other manner of defining a process that responds to a trigger or condition to produce an outcome may be used to construct an automation.
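
As a loose, hypothetical illustration of the trigger/condition/outcome structure described above, the following TypeScript sketch models a logical rule together with a sentence-like automation ("When status changes to Done, notify the owner"). The shape of the rule object and all names are assumptions made for this example only.

```typescript
// Illustrative sketch of an automation backed by a logical rule.

interface LogicalRule<Event> {
  trigger: (event: Event) => boolean;     // e.g. "when a status changes"
  condition?: (event: Event) => boolean;  // e.g. "and the status is Done"
  outcome: (event: Event) => void;        // e.g. "notify the item's owner"
}

function runAutomation<Event>(rule: LogicalRule<Event>, event: Event): void {
  // The outcome is produced only when the trigger fires and the optional
  // condition (if any) is satisfied.
  if (rule.trigger(event) && (rule.condition?.(event) ?? true)) {
    rule.outcome(event);
  }
}

// Example rule built from a sentence-like template:
// "When status changes to Done, notify the owner."
type StatusChange = { column: string; newValue: string; owner: string };

const notifyOnDone: LogicalRule<StatusChange> = {
  trigger: (e) => e.column === "status",
  condition: (e) => e.newValue === "Done",
  outcome: (e) => console.log(`Notify ${e.owner}: item marked Done`),
};

runAutomation(notifyOnDone, { column: "status", newValue: "Done", owner: "Dana" });
```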


Other terms used throughout this disclosure in differing exemplary contexts may generally share the following common definitions.


In some embodiments, machine learning algorithms (also referred to as machine learning models or artificial intelligence in the present disclosure) may be trained using training examples, for example in the cases described below. Some non-limiting examples of such machine learning algorithms may include classification algorithms, data regressions algorithms, image segmentation algorithms, visual detection algorithms (such as object detectors, face detectors, person detectors, motion detectors, edge detectors, etc.), visual recognition algorithms (such as face recognition, person recognition, object recognition, etc.), speech recognition algorithms, mathematical embedding algorithms, natural language processing algorithms, support vector machines, random forests, nearest neighbors algorithms, deep learning algorithms, artificial neural network algorithms, convolutional neural network algorithms, recursive neural network algorithms, linear machine learning models, non-linear machine learning models, ensemble algorithms, and so forth. For example, a trained machine learning algorithm may comprise an inference model, such as a predictive model, a classification model, a regression model, a clustering model, a segmentation model, an artificial neural network (such as a deep neural network, a convolutional neural network, a recursive neural network, etc.), a random forest, a support vector machine, and so forth. In some examples, the training examples may include example inputs together with the desired outputs corresponding to the example inputs. Further, in some examples, training machine learning algorithms using the training examples may generate a trained machine learning algorithm, and the trained machine learning algorithm may be used to estimate outputs for inputs not included in the training examples. In some examples, engineers, scientists, processes and machines that train machine learning algorithms may further use validation examples and/or test examples. For example, validation examples and/or test examples may include example inputs together with the desired outputs corresponding to the example inputs, a trained machine learning algorithm and/or an intermediately trained machine learning algorithm may be used to estimate outputs for the example inputs of the validation examples and/or test examples, the estimated outputs may be compared to the corresponding desired outputs, and the trained machine learning algorithm and/or the intermediately trained machine learning algorithm may be evaluated based on a result of the comparison. In some examples, a machine learning algorithm may have parameters and hyper parameters, where the hyper parameters are set manually by a person or automatically by a process external to the machine learning algorithm (such as a hyper parameter search algorithm), and the parameters of the machine learning algorithm are set by the machine learning algorithm according to the training examples. In some implementations, the hyper-parameters are set according to the training examples and the validation examples, and the parameters are set according to the training examples and the selected hyper-parameters.


Disclosed embodiments may include at least one processor that may be configured to execute stored instructions to perform table navigation operations on a display. In some embodiments, the table may comprise rows and columns and at least one cell that may contain more information than presented on the display. In some embodiments, the table may include a data structure such as a relational database having entries organized in rows and columns, with data occupying or potentially occupying each cell formed by a row-column intersection.


As used in this disclosure, the term “display” refers either to any physical device capable of providing a visual representation of a table or directly to a visual representation of a table. Examples of physical devices acting as displays may include computer screens, smartphone screens, tablet screens, smartwatch screens, laptop screens, video walls, projectors, head-mounted displays or virtual reality headsets. Additionally, displays may utilize graphical user interfaces (GUIs) to permit user interaction with data. In many GUIs, a visual representation of a table is often provided using a graphical user interface component known as a window. Any visual representation on a device or display may be characterized by dimensions; these dimensions are limited by the physical dimensions of the device, and, often, a table may not be completely presented on a display device or fit within a window. This drawback is common among devices with small screens, such as smartphones or tablets. The dimensions of the table cells may therefore be limited by the dimensions of the display.


Above and throughout this disclosure, the term “information” or “piece of information” may refer to any type of data with a visual representation, such as text, images, numbers, diagrams, drawings or GUI components that may be included in a table cell. Each piece of information may have an associated size, such as a size that a piece of information may have when fully presented on a display. The associated size of a piece of information may be larger than the dimensions of a table cell; in such a situation, the entirety of the information may therefore not be displayed in, or fit, the table cell. Accordingly, the information may be split into two parts: a first part that is visible and presented on the display, and a second part that is hidden and not presented on the display.
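
To make the visible/hidden split concrete, the following sketch estimates what fraction of a piece of information fits within a cell, assuming pixel sizes and simple top-left clipping. The function names and the area-based measure are illustrative assumptions, not a prescribed implementation.

```typescript
// Illustrative sketch: given the rendered ("associated") size of a piece of
// information and the dimensions of the cell that holds it, estimate how much
// of the information is visible and how much is hidden.

interface Size {
  width: number;  // pixels
  height: number; // pixels
}

interface VisibilitySplit {
  visibleFraction: number; // 0..1 of the information's area shown in the cell
  hiddenFraction: number;  // 0..1 of the information's area clipped away
}

function splitVisibility(content: Size, cell: Size): VisibilitySplit {
  const visibleWidth = Math.min(content.width, cell.width);
  const visibleHeight = Math.min(content.height, cell.height);
  const contentArea = content.width * content.height;
  const visibleArea = visibleWidth * visibleHeight;
  const visibleFraction = contentArea === 0 ? 1 : visibleArea / contentArea;
  return { visibleFraction, hiddenFraction: 1 - visibleFraction };
}

// Example: a 300x240 histogram in a 300x80 cell leaves one third visible.
const split = splitVisibility({ width: 300, height: 240 }, { width: 300, height: 80 });
console.log(split); // { visibleFraction: 0.333..., hiddenFraction: 0.666... }
```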



FIG. 1 is a flowchart of an exemplary process (100) for performing table navigation operations on a display that may be executed by at least one processor. Process 100 is discussed herein for explanatory purposes and is not intended to be limiting. In some embodiments, steps of process 100 may be changed, modified, substituted, or rearranged, consistent with the present disclosure. Process 100 may be implemented using one or more components of a computing device 400 (discussed in FIG. 4) or user device 520 of computing architecture 500 (discussed in FIG. 5). Disclosed embodiments may include at least one processor that may be configured to execute stored instructions to perform table navigation operations on a display. In some embodiments, the table may comprise rows and columns and at least one cell that may contain more information than presented on the display. As shown in FIG. 1, process 100 may include steps 102, 104, 106, 108, and 110, discussed in further detail below.



FIG. 2 is an exemplary illustration of a table presented on a display comprising rows and columns and at least one cell that contains more information than presented on the display, in accordance with the disclosed embodiments. FIG. 2 illustrates table 200 comprising a plurality of rows 202 and columns 204 presented on a display 206, the dimensions of which are illustrated by a dashed line. Table 200 cells comprise different types of information, such as text, pictures, numbers, or dates. Table 200 comprises three cells that contain more information than presented on the display 206. These three cells are all included in the first column 204-1 and are located in rows 202-1, 202-3, and 202-7, respectively. The first two cells (202-1/204-1, 202-3/204-1) comprise text that is not entirely displayed, as indicated by the presence of an ellipsis. Cell (202-7/204-1) includes a histogram that is larger in size than cell (202-7/204-1); thus, it is displayed in a cropped version in which only the top of the histogram is visible. Note that, in the situation illustrated in FIG. 2, table 200 is not entirely presented on display 206; the whole of table 200 is shown to convey the actual dimensions of table 200, and portions of table 200 that are not comprised within the dashed line are not presented on display 206.


In some embodiments, all table cells may share at least one common dimension. A common dimension may refer to a common width, a common height, or both a common height and width. For example, as illustrated in FIG. 2, table cells from the first column 204-1 share a common height with table cells from the other columns (204-2, 204-3, 204-4) but have a different width. In some other embodiments, all table cells may have different dimensions. Whether or not all the cells in the table share at least one common dimension may be the result of a specific set of instructions. This instruction set may have different origins; for example, it may be directly related to the display, the displayed table, or a user input.


In some embodiments, the table may not be entirely displayed. For example, some rows or columns may not be visible because the table does not fit the dimensions of the display. This situation is illustrated in FIG. 2, where only the first three columns (204-1 through 204-3) fit within the dimensions of display 206. To permit a user to access different portions of a table that are not displayed, a scroll signal for scrolling the table may be used.


Referring again to FIG. 1, in step 102, at least one processor may receive a scroll signal for scrolling the table, wherein the scroll signal results from a motion of a user on the display. A scroll signal may be a command or input that causes a displayed page or document to move on the display, or in a window on the display, to a particular portion of the page/document. Consistent with disclosed embodiments, the scroll signal may be the result of, for example, a motion performed by a user on a touch-sensitive layer of the display, user control of a computer mouse or stylus, or computer input with one or more keys on a keyboard. This motion may correspond to manipulating one or more device controls or GUI components such as scroll bars, a particular movement made by the user or any type of user input related to the display, for scrolling the table. For example, a scroll signal can be the result of moving a cursor docked on or near a scroll bar, clicking on increment/decrement control buttons, using the scroll wheel button of a mouse, or a user performing one or more recognized gestures such as a swipe motion relative to the display.
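
As a rough illustration of how such a scroll signal might be produced from a touch motion, the sketch below listens to standard DOM touch events and emits the motion's horizontal and vertical components. The ScrollSignal shape and the event-sampling approach are assumptions made for this example, not a prescribed implementation.

```typescript
// Sketch of turning a user's touch motion into a scroll signal, assuming the
// table is rendered inside an HTML element.

interface ScrollSignal {
  dx: number; // horizontal component of the motion, in pixels
  dy: number; // vertical component of the motion, in pixels
}

function attachScrollSignalSource(
  el: HTMLElement,
  onScrollSignal: (signal: ScrollSignal) => void,
): void {
  let last: { x: number; y: number } | null = null;

  el.addEventListener("touchstart", (e) => {
    const t = e.touches[0];
    last = { x: t.clientX, y: t.clientY };
  });

  el.addEventListener("touchmove", (e) => {
    if (!last) return;
    const t = e.touches[0];
    // Each movement of the finger produces a scroll signal describing the
    // motion's horizontal and vertical components since the last sample.
    onScrollSignal({ dx: t.clientX - last.x, dy: t.clientY - last.y });
    last = { x: t.clientX, y: t.clientY };
  });

  el.addEventListener("touchend", () => {
    last = null;
  });
}
```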


Referring again to FIG. 1, in step 104, in response to a vertical component in the motion, the at least one processor may cause the table to scroll vertically. As discussed herein, the display may have a vertical dimension extending along a first axis, such as a vertical axis, and a horizontal dimension extending along a second axis perpendicular to the first axis, such as a horizontal axis. Consistent with this disclosure, the motion may comprise at least one of a vertical or horizontal component. A vertical or horizontal component of motion may refer to a part of the direction of motion aligned with the vertical or horizontal axis or to an interaction with a GUI component that causes a scrolling along the vertical or horizontal axis. The at least one processor may be configured to detect a motion of the user using one or more input devices, such as a touch-sensitive layer of the display screen, and analyze the motion to detect one or more of a horizontal or vertical component of the motion. The at least one processor may move the page vertically, such as upward or downward, on the display, in response to the vertical component of the motion.


In step 106, in response to a horizontal component in the motion, the at least one processor may cause the table to scroll horizontally. Similar to the vertical scroll discussed above, the at least one processor may be configured to detect a motion of the user using one or more input devices, such as a touch-sensitive layer of the display screen, and analyze the motion to detect a horizontal component of the motion. The at least one processor may move the page horizontally, such as left or right, on the display, in response to the horizontal component of the motion.


For example, in the situation illustrated in FIG. 2, a motion with a horizontal (leftward) component may cause table 200 to scroll horizontally, which may cause a larger portion of the fourth column 204-4 to appear on the display 206. In some embodiments, scrolling the table may be proportional to a length of the motion. In the context of this description, a “length” of the motion may refer to a physical length, a time duration of motion or a combination thereof.
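
The following sketch illustrates steps 104 and 106 together with the proportionality just described, assuming a simple viewport model in which the scroll amount scales with the components of the motion. The Viewport shape and the gain factor are illustrative assumptions.

```typescript
// Sketch of steps 104-106: scroll vertically in response to the vertical
// component of the motion and horizontally in response to the horizontal
// component, with the scroll distance proportional to the motion's length.

interface Viewport {
  scrollX: number; // leftmost visible table x-coordinate
  scrollY: number; // topmost visible table y-coordinate
}

function applyScrollSignal(
  viewport: Viewport,
  signal: { dx: number; dy: number },
  gain = 1.0, // proportionality factor between motion length and scroll amount
): Viewport {
  return {
    // Dragging the content left (dx < 0) moves the viewport right, and vice
    // versa, which is why the components are subtracted.
    scrollX: viewport.scrollX - gain * signal.dx,
    scrollY: viewport.scrollY - gain * signal.dy,
  };
}

// A purely vertical motion leaves scrollX unchanged; a diagonal motion
// scrolls along both axes at once.
console.log(applyScrollSignal({ scrollX: 0, scrollY: 0 }, { dx: 0, dy: -40 }));
// -> { scrollX: 0, scrollY: 40 }
```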


In step 108, the at least one processor may receive an enhancing scroll signal resulting from an enhancing motion for revealing hidden information within the at least one cell on the table. In accordance with the disclosed embodiments, upon reception of an enhancing scroll signal, portions of the hidden information within the at least one cell may be revealed, wherein the enhancing scroll signal may be the result of an enhancing motion for revealing hidden information within the at least one cell. An enhancing motion may correspond to manipulating various controls on GUI components, a particular movement made by the user or any type of user input related to the display, for revealing hidden information within the at least one cell. Examples of an enhancing motion may include performing a swipe motion relative to the display that is interpreted differently than the scroll signal, persistently clicking at a particular location of the display, double or repeatedly tapping a particular location on the display, performing a pinching out motion relative to the display, and other motions or commands that are interpreted differently than a scroll signal due to the satisfaction of one or more conditions. In some embodiments, the enhancing motion may be different from a motion capable of scrolling the table horizontally or vertically.
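
One possible way to distinguish an enhancing motion from an ordinary scroll motion is sketched below, treating a long press or a pinch-out as the enhancing motion. The specific gestures and thresholds are assumptions chosen for illustration and are not required by the disclosure.

```typescript
// Sketch of step 108: classifying a recognized gesture as either an ordinary
// scroll signal or an enhancing scroll signal.

type Gesture =
  | { kind: "scroll"; dx: number; dy: number }
  | { kind: "longPress"; x: number; y: number; durationMs: number }
  | { kind: "pinchOut"; scale: number };

type Signal = { kind: "scroll"; dx: number; dy: number } | { kind: "enhance" };

function classifyGesture(g: Gesture): Signal {
  switch (g.kind) {
    case "longPress":
      // A press held beyond ~500 ms is treated as an enhancing motion;
      // a shorter press is treated as a no-op scroll here.
      return g.durationMs >= 500 ? { kind: "enhance" } : { kind: "scroll", dx: 0, dy: 0 };
    case "pinchOut":
      // A pinch-out that noticeably grows the touch span is also enhancing.
      return g.scale > 1.1 ? { kind: "enhance" } : { kind: "scroll", dx: 0, dy: 0 };
    case "scroll":
      return { kind: "scroll", dx: g.dx, dy: g.dy };
  }
}
```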


In step 110, in response to the enhancing scroll signal, the at least one processor may reveal at least a portion of the hidden information within the at least one cell. In some embodiments, responsive to receipt of an enhancing motion, the at least one cell may be displayed in an enhanced form, such that a portion of the hidden information may be revealed.


In some embodiments, revealing the hidden information may be progressive, and a ratio between an amount of the information revealed and an amount of information initially hidden may be proportional to a length of the enhancing motion. In the context of this description, a progressive revelation may refer to a situation in which the transition from a state in which some of the information in a table cell is hidden to a state in which all of the information is visible in the table cell is not sudden. Instead, the revelation may proceed continuously or step by step. For example, if a fairly long text is included in a cell and only part of it is visible, instead of revealing all the missing words at once, the process may be progressive, revealing the hidden part of the text word by word or sentence by sentence. To carry out this progressive revelation, a ratio between a quantity of revealed information and a quantity of information initially hidden, such as prior to any enhancing scroll motion, may be used. As an example, consider a situation where the information contained in a cell is an image and only one-third of the image is visible. Two-thirds of the image may be hidden, and after the revelation process starts, half of the image may become visible, such that the ratio between the amount of information revealed and the amount of information initially hidden is 25%. The revelation process continues: two-thirds of the image is now visible, and the ratio is 50%. The process continues and ends when the ratio is 100%.


In some embodiments, the ratio may be proportional to a length of the enhancing motion. A length of the motion may include a physical length of the motion on the display, a time duration of motion, or a combination thereof. For example, if the enhancing motion corresponds to a swipe motion made by a user across a display, the longer the distance of the swipe motion across the display, the larger the ratio between the quantity of information revealed to the quantity of information initially hidden. In another example, if the enhancing movement corresponds to a persistent touch of a part of a display, the longer the duration of the persistent touch, the larger the ratio between the quantity of revealed information and the quantity of information initially hidden.
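
The proportional, progressive reveal described above might be computed as in the following sketch, which maps the length of the enhancing motion to a reveal ratio and reproduces the one-third-visible image example. The pixels-per-full-reveal constant is an assumed tuning value, not a value taken from the disclosure.

```typescript
// Sketch of a progressive reveal whose ratio grows in proportion to the
// length of the enhancing motion.

const MOTION_LENGTH_FOR_FULL_REVEAL_PX = 200; // assumed: a 200 px swipe -> 100%

function revealRatio(enhancingMotionLengthPx: number): number {
  const ratio = enhancingMotionLengthPx / MOTION_LENGTH_FOR_FULL_REVEAL_PX;
  return Math.min(1, Math.max(0, ratio)); // clamp to [0, 1]
}

// Given the fraction of the content that was initially visible, compute the
// fraction that should be visible at a given reveal ratio.
function visibleFraction(initiallyVisible: number, ratio: number): number {
  const initiallyHidden = 1 - initiallyVisible;
  return initiallyVisible + ratio * initiallyHidden;
}

// Worked example mirroring the image described above: one third visible
// initially; at a 25% ratio, half of the image is visible.
console.log(visibleFraction(1 / 3, 0.25)); // 0.5
console.log(visibleFraction(1 / 3, 0.5));  // 0.666...
```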


In some embodiments, “revealing” hidden information may include making visible at least a part of the information that was not previously visible, accessible, or presented on a display. For example, referring to FIG. 2, revealing the histogram included in cell (202-7/204-1) may correspond to displaying the whole histogram.


In some embodiments, the enhancing motion may correspond to a motion capable of scrolling the table horizontally or vertically. In such embodiments, a result of the motion may correspond to at least one of: revealing hidden information within the at least one cell on the table if the motion is performed on a predetermined portion of the display; or causing the table to scroll horizontally or vertically if the motion is performed on a portion different from the predetermined portion of the display. In some embodiments, a predetermined portion of the display may refer to a portion of the display having certain characteristics. Examples of a predetermined portion of a display may include a corner, an edge, or a zone of a display. Consistent with this disclosure, performing a motion capable of scrolling the table horizontally or vertically on a predetermined zone may result in revealing hidden information within at least one cell.


Referring to FIG. 2, a predetermined portion of display 206 may correspond to the leftmost area of display 206 with a height corresponding to the height of the display and a width corresponding to 10% of the width of the display. If the motion capable of scrolling the table horizontally is a horizontal swipe motion, a horizontal swipe motion starting within the predetermined portion may result in the revelation of hidden information, and a swipe motion starting outside the predetermined portion (for example in the central zone) may result in the horizontal scrolling of the table.
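
A minimal sketch of this predetermined-portion behavior is shown below, using the 10%-wide leftmost strip from the example. The strip width and the callback names are assumptions made for illustration.

```typescript
// Sketch: the same horizontal swipe either reveals hidden information (if it
// starts inside the leftmost strip of the display) or scrolls the table (if
// it starts anywhere else).

interface DisplayRect {
  width: number;
  height: number;
}

function isInPredeterminedPortion(startX: number, display: DisplayRect): boolean {
  return startX <= display.width * 0.1; // leftmost 10% of the display width
}

function handleHorizontalSwipe(
  startX: number,
  dx: number,
  display: DisplayRect,
  actions: { reveal: (amountPx: number) => void; scrollX: (dx: number) => void },
): void {
  if (isInPredeterminedPortion(startX, display)) {
    actions.reveal(Math.abs(dx)); // motion in the strip reveals hidden content
  } else {
    actions.scrollX(dx);          // motion elsewhere scrolls the table
  }
}
```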


In some embodiments, revealing the hidden information may cease after the enhancing motion terminates. Ceasing to reveal hidden information may refer to the end of the revelation process, regardless of whether all the hidden information in the at least one cell has been revealed. For example, in a situation wherein a ratio between an amount of the information revealed and an amount of information initially hidden is proportional to a length of the enhancing motion, the enhancing motion may terminate before the length of the enhancing motion reaches a value corresponding to a ratio equal to 100%. In some embodiments, ceasing to reveal hidden information in the at least one cell may comprise restoring an initial situation in which none of the hidden information in the at least one cell is revealed. For example, if 60% of the initially hidden information of the at least one cell was revealed before the enhancing motion terminated, then after the enhancing motion terminates, the initial situation in which 0% of the initially hidden information is revealed will be presented on the display. In some embodiments, restoring the initial situation in which none of the hidden information in the at least one cell is revealed may be progressive or sudden.


In some embodiments, the hidden information may be revealed for a predetermined time period. Revealing for a predetermined time period may involve the at least one cell remaining in an enhanced form for a predetermined amount of time after the enhancing motion terminates, or after another trigger consistent with disclosed embodiments. Examples of a predetermined time period may include 1, 2, 5, or 10 seconds, or any other suitable time period. In some embodiments, once the predetermined time is over, the displayed table may revert to its initial appearance, in which the revealed information in the at least one cell becomes hidden again. For example, if at least one cell is in an enhanced form after receiving an enhancing scroll signal, the at least one cell may remain in the enhanced form for a predetermined period of time (for example, 2 seconds) before returning to an initial state. In some further embodiments, restoring the initial situation in which none of the hidden information in the at least one cell is revealed may be progressive or sudden.
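
The timed revert described above might look like the following sketch, which keeps the cell in its enhanced form for a predetermined period (2 seconds, as in the example) and then restores the initial state. setEnhanced is an assumed callback that toggles the cell's presentation; it is not an API of any particular platform.

```typescript
// Sketch: hold the enhanced form for a predetermined time, then revert.

function holdEnhancedForm(
  setEnhanced: (enhanced: boolean) => void,
  holdMs = 2000, // predetermined time period, e.g. 2 seconds
): () => void {
  setEnhanced(true); // enter the enhanced form when the enhancing signal arrives

  const timer = setTimeout(() => setEnhanced(false), holdMs);

  // Returning a cancel function lets a new enhancing signal extend the hold.
  return () => clearTimeout(timer);
}
```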


In some embodiments, the enhancing scroll signal may be based on a scroll motion performed beyond an end of the table. An end of the table may be associated with any characteristic of the outermost edges, locations, or sides of the table. Ends of a table may include, for example, a first or last row, a first or last column, outer borders, outer corners, or row or column labels. For example, with reference to FIG. 2, one end of table 200 may correspond to the first column 204-1, and the enhancing motion may be based on a scroll motion beyond the first column 204-1, when no scrolling to the left of the table is possible since the left edge of the first column is displayed.


In some embodiments, the at least one cell may be located at a first end of the table or at a second end of the table opposite the first end. A first end of the table may correspond to a first row or column, and a second end of the table opposite to the first end may correspond to a last row or column. For example, as illustrated in FIG. 2, the three cells ((202-1/204-1), (202-3/204-1), (202-7/204-1)), that contain more information than presented on the display are all included in the first column 204-1.


In some further embodiments, wherein the enhancing scroll signal is based on a scroll motion performed beyond an end of the table, the hidden information may be revealed while the enhancing scroll motion is performed on a predetermined portion of the display. As mentioned above, a predetermined portion of the display may refer to a portion of the display having certain characteristics. With reference to FIG. 2, a predetermined portion of display 206 may correspond to the leftmost area of display 206, with a height corresponding to the height of the display and a width corresponding to 20% of the width of the display. If the enhancing scroll signal is based on a scroll motion capable of scrolling the table horizontally, performed beyond one end of the table (e.g., the left edge of table 200) as a horizontal swipe motion, a leftward horizontal swipe motion starting inside the predetermined portion and performed beyond the left edge of table 200 may result in the revelation of hidden information, and a leftward horizontal swipe motion starting outside the predetermined portion and performed beyond the left edge of table 200 may result in the table scrolling horizontally to the left edge of table 200.


In some embodiments where the enhancing scroll signal is based on a scroll motion performed beyond an end of the table, the at least one processor may cause both scrolling and a partial revealing of the hidden information if the scroll signal exceeds a threshold. In the context of this description, a threshold may refer to a certain length of a scroll signal, where the scroll signal is initially executed to scroll the table and is long enough to scroll the table beyond one end of the table. Examples of a threshold may include a physical length threshold, a time duration of motion threshold, or a combination thereof. In accordance with this disclosure, such a scroll signal, rather than scrolling only to one end of the table, may also cause a partial revelation of information hidden in at least one cell. A partial revealing of the hidden information may refer to a certain percentage of the information being revealed. In some embodiments, a partial revealing of the hidden information may include revealing a percentage of the hidden information strictly below 100%. A user without prior knowledge of the existence of an enhancing scroll motion may discover this feature by performing a scroll motion exceeding the threshold.
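
One possible reading of this threshold behavior is sketched below: once the overscroll past the table's end exceeds an assumed length threshold, the handler both scrolls to the edge and reports a partial reveal ratio strictly below 100%. The threshold value and the mapping from excess motion to reveal ratio are illustrative assumptions.

```typescript
// Sketch: a long overscroll beyond the end of the table both scrolls the
// table to its edge and partially reveals hidden information.

const OVERSCROLL_THRESHOLD_PX = 40; // assumed physical length threshold

interface OverscrollResult {
  scrollToEdge: boolean;       // the table is scrolled to its end
  partialRevealRatio: number;  // strictly below 1 (100%)
}

function handleOverscroll(overscrollPx: number): OverscrollResult {
  if (overscrollPx <= OVERSCROLL_THRESHOLD_PX) {
    return { scrollToEdge: true, partialRevealRatio: 0 };
  }
  // Map the excess beyond the threshold to a partial reveal, capped below 100%.
  const excess = overscrollPx - OVERSCROLL_THRESHOLD_PX;
  const partialRevealRatio = Math.min(0.5, excess / 400);
  return { scrollToEdge: true, partialRevealRatio };
}
```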


In some embodiments, revealing the hidden information may include at least one of: enlarging the at least one cell in order to present all the information contained within the at least one cell; or modifying a size of all information contained within the at least one cell to enable presentation of all the information contained within the at least one cell without changing dimensions of the at least one cell. Enlarging the at least one cell may refer to increasing at least one of the dimensions of the at least one cell. Modifying a size of all information contained within the at least one cell without changing dimensions of the at least one cell, on the other hand may refer to reducing the associated size of all information contained within the at least one cell, to fit within the dimensions of the at least one cell. Accordingly, a cell in an enhanced form may either have at least one of its dimensions increased or comprise a piece of information with reduced dimensions. In other embodiments, revealing the hidden information may include a combination of enlarging the at least one cell and modifying a size of all information contained within the at least one cell in order to present all the information contained within the at least one cell. In such a situation, a compromise may be found between dimensions that are too large for a cell and dimensions that are too small for a piece of information. In some other embodiments, revealing the hidden information may include enlarging the dimensions of the at least one cell to the dimensions of the display, and if in the enlarged version of the at least one cell not all of the hidden information is revealed, further modifying a size of the information to enable a presentation of all of the information contained in the enlarged version of the at least one cell.
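
The two reveal strategies described above, enlarging the cell versus scaling its content to fit the unchanged cell, might be expressed as in the following sketch. Pixel sizes and uniform content scaling are assumptions made for the example.

```typescript
// Sketch of the two reveal strategies for a cell whose content overflows.

interface CellSize {
  width: number;  // pixels
  height: number; // pixels
}

// Strategy 1: enlarge the at least one cell so all of its content fits.
function enlargeCellToFit(cell: CellSize, content: CellSize): CellSize {
  return {
    width: Math.max(cell.width, content.width),
    height: Math.max(cell.height, content.height),
  };
}

// Strategy 2: shrink the content uniformly so it fits the unchanged cell.
function scaleContentToFit(cell: CellSize, content: CellSize): CellSize {
  const scale = Math.min(cell.width / content.width, cell.height / content.height, 1);
  return { width: content.width * scale, height: content.height * scale };
}
```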



FIGS. 3A and 3B illustrate two tables presented on a display comprising rows and columns after reception of an enhancing scroll signal, in which the at least one cell that initially contained more information than presented on the display is in an enhanced form, in accordance with the disclosed embodiments. The tables illustrated in these figures correspond to table 200 previously illustrated in FIG. 2, wherein the three cells ((202-1/204-1), (202-3/204-1), (202-7/204-1)) that initially contained more information than presented on display 206 are now in an enhanced form in which all of the information comprised in these cells is visible. The text included in the first two cells (202-1/204-1, 202-3/204-1) is now entirely displayed, and the histogram included in the third cell (202-7/204-1) is now fully displayed. Note that, in the situation illustrated in FIGS. 3A and 3B, table 200 is not entirely presented on display 206; the whole of table 200 is shown to convey the actual dimensions of table 200 when the three cells are in the enhanced form, and portions of table 200 that are not comprised within the dashed line are not presented on display 206.


In some embodiments, the enlarged at least one cell may include at least one of an increased height, an expansion in width, or an expansion in both height and width. For example, as illustrated in FIG. 3A, the height of the three cells ((202-1/204-1), (202-3/204-1), (202-7/204-1)) has been increased so that all the information comprised in these cells is now visible. Note that, in the situation illustrated in FIG. 3A, because the height of cells ((202-1/204-1), (202-3/204-1), (202-7/204-1)) has been increased, table 200 no longer fits the vertical dimension of display 206, and only the first five rows (202-1 through 202-5) and the first two columns (204-1 through 204-2) are presented on display 206. Further, in some embodiments, enlarging the at least one cell may include increasing a first dimension of the at least one cell until it reaches a corresponding display dimension and, if the enlarged version of the at least one cell is not sufficient to reveal all of the hidden information, increasing a second dimension of the at least one cell. For example, referring to FIG. 3A, the enlargement of the three cells ((202-1/204-1), (202-3/204-1), (202-7/204-1)) could have started by enlarging the width of these cells until the horizontal dimension of display 206 was reached; if not all hidden information was then revealed, the enlargement could have continued by enlarging the height of these cells. In yet other embodiments, enlarging the at least one cell may include increasing a first dimension of the at least one cell until it reaches a first corresponding display dimension and increasing a second dimension of the at least one cell beyond a second corresponding display dimension. In a situation where the information included in the at least one cell has dimensions greater than the dimensions of the display, enlarging the at least one cell may require increasing at least one of the dimensions of the at least one cell beyond one of the dimensions of the display.
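
For illustration only, the sketch below shows one possible realization of this progressive enlargement, assuming the first dimension is width and the second is height; the Dims type and the function enlargeProgressively are hypothetical names introduced for this example.

```typescript
// Hypothetical sketch of progressive enlargement: widen the cell first, up to
// the display width, and only increase the height if the content still does
// not fit, possibly beyond the display height.

interface Dims { width: number; height: number; }

function enlargeProgressively(cell: Dims, content: Dims, display: Dims): Dims {
  // Step 1: increase the width, capped at the display width.
  const width = Math.min(Math.max(cell.width, content.width), display.width);

  // Step 2: only if the widened cell still cannot show everything,
  // increase the height as well (it may exceed the display height when the
  // content itself is taller than the display).
  const stillHidden = content.width > width || content.height > cell.height;
  const height = stillHidden ? Math.max(cell.height, content.height) : cell.height;

  return { width, height };
}
```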


In some embodiments, at least one dimension of one or more cells adjacent to the at least one cell may remain unchanged. A cell adjacent to the at least one cell may refer to a cell that shares at least one edge with the at least one cell. For example, as illustrated in FIG. 3B, in which the associated size of the information comprised in cells ((202-1/204-1), (202-3/204-1), (202-7/204-1)) has been decreased to fit the dimensions of those cells, all the dimensions of all the cells adjacent to cells ((202-1/204-1), (202-3/204-1), (202-7/204-1)) may remain unchanged. In another example, as illustrated in FIG. 3A, in which the height of cells ((202-1/204-1), (202-3/204-1), (202-7/204-1)) has been increased to reveal all of the hidden information comprised in those cells, some of the cells adjacent to cells ((202-1/204-1), (202-3/204-1), (202-7/204-1)), such as cells (202-2/204-1), (202-4/204-1), and (202-6/204-1), have unchanged widths and heights, while other adjacent cells, such as cells (202-1/204-2), (202-3/204-2), and (202-7/204-2), have unchanged widths and increased heights.


In some embodiments, after the enhancing scroll signal terminates, the enlarged at least one cell may return to its original dimensions, or the size of all information contained within the at least one cell may return to its original size. The return to the original dimensions of the at least one cell, or to the original size of a piece of information included in the at least one cell, may occur regardless of whether all of the information hidden in the at least one cell has been revealed. For example, in FIG. 3A, once the enhancing scroll signal is completed, the heights of cells ((202-1/204-1), (202-3/204-1), (202-7/204-1)) may return to their original values; similarly, in FIG. 3B, once the enhancing scroll signal is completed, the sizes of the pieces of information comprised in cells ((202-1/204-1), (202-3/204-1), (202-7/204-1)) may return to their original associated sizes. Thus, the display returns to the appearance illustrated in FIG. 2.
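
The following sketch illustrates, by way of assumption only, one way the original presentation could be remembered and restored when the enhancing scroll signal terminates; the EnhancedCell class, the CellState type, and their members are hypothetical names.

```typescript
// Hypothetical sketch: remember a cell's original presentation so it can be
// restored when the enhancing scroll signal terminates, whether or not all of
// the hidden information was revealed in the meantime.

interface CellState { width: number; height: number; contentScale: number; }

class EnhancedCell {
  private original: CellState | null = null;

  constructor(private current: CellState) {}

  /** Called when the enhancing scroll signal starts. */
  enhance(enhanced: CellState): void {
    if (this.original === null) this.original = { ...this.current };
    this.current = enhanced;
  }

  /** Called when the enhancing scroll signal terminates: restore the
   *  original dimensions and content size. */
  restore(): void {
    if (this.original !== null) {
      this.current = this.original;
      this.original = null;
    }
  }

  get state(): CellState { return this.current; }
}
```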


In some embodiments, the enhancing scroll signal may cause a font size to change. For example, as illustrated in FIG. 3B, during the process of reducing the size of the texts comprised in cells ((202-1/204-1), (202-3/204-1)), the font size of the texts has been reduced. Further, in some embodiments, a minimum font size may be set, and if, after reducing the font size to the minimum font size, all of the information included in the at least one cell is not revealed, the at least one cell may be enlarged. For example, referring to FIG. 3B, the font size of cell (202-3/204-1) may be less than a minimum font size; in that case, revealing the hidden information of cell (202-3/204-1) could instead have been accomplished by first reducing the font size to the minimum font size and then increasing the height of cell (202-3/204-1) so that all the text included in that cell is displayed at the minimum font size.
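
As a hypothetical illustration of the minimum-font-size rule, the sketch below shrinks the font only down to an assumed floor and grows the cell height to cover any remaining shortfall; MIN_FONT_PX, the TextFit type, and the function fitText are illustrative names and values, not part of the disclosed embodiments.

```typescript
// Hypothetical sketch of the minimum-font-size rule: shrink the font no
// further than a floor, then enlarge the cell height so the remaining text
// can be shown at that floor size.

const MIN_FONT_PX = 9; // assumed minimum readable font size

interface TextFit { fontPx: number; cellHeightPx: number; }

/**
 * @param naturalFontPx  font size the text is normally rendered at
 * @param requiredScale  factor (< 1) needed for the text to fit the cell
 * @param cellHeightPx   current cell height
 */
function fitText(naturalFontPx: number, requiredScale: number,
                 cellHeightPx: number): TextFit {
  const targetFont = naturalFontPx * requiredScale;
  if (targetFont >= MIN_FONT_PX) {
    return { fontPx: targetFont, cellHeightPx }; // shrinking alone is enough
  }
  // The font would drop below the floor: clamp it and grow the cell instead,
  // in proportion to the remaining shortfall.
  const growth = MIN_FONT_PX / targetFont;
  return { fontPx: MIN_FONT_PX, cellHeightPx: cellHeightPx * growth };
}
```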


In some embodiments, revealing the hidden information may further include enlarging the at least one cell, and wrapping one or more lines of text contained within the at least one cell. For example, in FIG. 3A, the height of cells ((202-1/204-1), (202-3/204-1)) has been increased and the texts comprised in these cells are now entirely visible in a form where one or more lines of these texts have been wrapped. In some embodiments, "wrapping" may involve a word processing or text-editing process or subroutine breaking lines of text, or lengthening lines of text, automatically, to display an increased amount of text within certain margins or boundaries, without requiring a user to manually move or edit the text. In the context of the disclosed embodiments, wrapping may cause the content of a cell to become more readily visible, or fully visible, when the dimensions of the cell are altered to yield additional space for displaying the content.
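
By way of example only, the following sketch shows a simplified word-wrapping helper in the sense described above; the character-count fitting model and the name wrapText are simplifying assumptions made for this illustration.

```typescript
// Hypothetical word-wrapping helper: break a string into as many lines as
// needed so each line fits an assumed per-line character budget.

function wrapText(text: string, maxCharsPerLine: number): string[] {
  const lines: string[] = [];
  let line = "";
  for (const word of text.split(/\s+/)) {
    const candidate = line.length === 0 ? word : `${line} ${word}`;
    if (candidate.length <= maxCharsPerLine || line.length === 0) {
      line = candidate; // keep filling the current line
    } else {
      lines.push(line); // line is full: start a new one
      line = word;
    }
  }
  if (line.length > 0) lines.push(line);
  return lines;
}

// Example: wrapping an enlarged cell's text into 20-character lines.
// wrapText("Collect and analyze customer feedback for Q3", 20)
//   -> ["Collect and analyze", "customer feedback", "for Q3"]
```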


In some embodiments, the at least one processor may lock the at least one cell in an enhanced form until an unlock signal is received. An unlock signal may refer to a signal generated upon reception of any type of user input related to the display. Examples of user input may include manipulating various controls on GUI components or performing a particular movement. For example, a swipe motion or a pinching-in motion may result in the generation of an unlock signal. In some embodiments, a motion that is the reverse of the enhancing motion may lead to the generation of an unlock signal. Examples of motion/reversed motion pairs may include pinching in/pinching out, or scrolling beyond an end of a table/scrolling toward an end of a table. In some embodiments, repeating the enhancing motion while the at least one cell is in an enhanced form may lead to the generation of an unlock signal. For example, if the enhancing motion corresponds to a double tap on a particular portion of the screen, a further double tap on that portion of the screen may lead to the generation of an unlock signal and trigger the restoration of an initial state in which the at least one cell includes more information than is displayed. In some other embodiments, locking the at least one cell in an enhanced form may include locking the at least one cell only if all the hidden information comprised in the at least one cell is revealed.
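
The sketch below is a hypothetical illustration of locking a cell in its enhanced form until an unlock signal is received, treating either the reverse of the enhancing motion or a repetition of it as an unlock signal; the gesture names and the EnhancementLock class are assumptions made for this example only.

```typescript
// Hypothetical sketch of the lock/unlock behavior: a cell locked in its
// enhanced form is released either by the reverse of the enhancing motion or
// by repeating the enhancing motion itself.

type Gesture = "pinch-out" | "pinch-in" | "double-tap" | "overscroll" | "scroll-back";

class EnhancementLock {
  private locked = false;
  private lockingGesture: Gesture | null = null;

  /** Lock the cell in its enhanced form, remembering which motion did it. */
  lock(gesture: Gesture): void {
    this.locked = true;
    this.lockingGesture = gesture;
  }

  /** Returns true if the incoming gesture should be treated as an unlock signal. */
  shouldUnlock(gesture: Gesture): boolean {
    if (!this.locked || this.lockingGesture === null) return false;
    const reversed: Record<Gesture, Gesture> = {
      "pinch-out": "pinch-in",
      "pinch-in": "pinch-out",
      "overscroll": "scroll-back",
      "scroll-back": "overscroll",
      "double-tap": "double-tap", // repeating the enhancing motion also unlocks
    };
    return gesture === reversed[this.lockingGesture] || gesture === this.lockingGesture;
  }

  unlock(): void {
    this.locked = false;
    this.lockingGesture = null;
  }
}
```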


In some embodiments, in a situation wherein the at least one cell is locked in an enhanced form and the unlock signal is not the result of a motion capable of scrolling the table horizontally or vertically, one or more scroll signals may be received for causing the table, with the at least one cell in an enhanced form, to scroll vertically or horizontally. Referring to FIG. 3A, once cells ((202-1/204-1), (202-3/204-1), (202-7/204-1)) are locked in their enhanced form, a vertical scrolling signal may be received to scroll table 200 vertically and present on display 206 portions of table 200 that were not visible before, such as cell (202-7/204-1) in an enhanced form.


In some embodiments, a system may be configured to perform table navigation operations on a display. As discussed herein, the table may include rows and columns and at least one cell that contains more information than it presents. The system may comprise a memory storing instructions and at least one processor that executes the stored instructions to perform operations. Consistent with FIG. 1 described above, the at least one processor may receive a scroll signal for scrolling the table, wherein the scroll signal results from a motion of a user on the display. In response to a vertical component in the motion, the at least one processor may cause the table to scroll vertically. In response to a horizontal component in the motion, the at least one processor may cause the table to scroll horizontally. The at least one processor may receive an enhancing scroll signal resulting from an enhancing motion for revealing hidden information within the at least one cell on the table. In response to the enhancing scroll signal, the at least one processor may reveal at least a portion of the hidden information within the at least one cell.
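
As a non-limiting sketch of this overall sequence of operations, the following TypeScript decomposes a user motion into vertical and horizontal scroll components and treats a distinct enhancing motion as a request to reveal hidden cell content; the Motion and TableViewport types and the handleMotion function are hypothetical names introduced for this illustration.

```typescript
// Hypothetical sketch of the operation sequence described above: ordinary
// motions scroll the table along their vertical and horizontal components,
// while an enhancing motion reveals at least part of the hidden information.

interface Motion { dx: number; dy: number; isEnhancing: boolean; }

interface TableViewport {
  scrollBy(dx: number, dy: number): void;
  revealHiddenContent(): void;
}

function handleMotion(motion: Motion, table: TableViewport): void {
  if (motion.isEnhancing) {
    // Enhancing scroll signal: reveal at least a portion of the hidden information.
    table.revealHiddenContent();
    return;
  }
  // Ordinary scroll signal: apply the vertical and horizontal components.
  if (motion.dy !== 0) table.scrollBy(0, motion.dy);
  if (motion.dx !== 0) table.scrollBy(motion.dx, 0);
}
```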



FIG. 4 is a block diagram of an exemplary computing device 400 consistent with some embodiments. In some embodiments, computing device 400 may be similar in type and function to user device 520, discussed with respect to FIG. 5. As shown in FIG. 4, computing device 400 may include processing circuitry 410, such as, for example, a central processing unit (CPU). In some embodiments, the processing circuitry 410 may include, or may be a component of, a larger processing unit implemented with one or more processors. The one or more processors may be implemented with any combination of general-purpose microprocessors, microcontrollers, digital signal processors (DSPs), field programmable gate arrays (FPGAs), programmable logic devices (PLDs), controllers, state machines, gated logic, discrete hardware components, dedicated hardware finite state machines, or any other suitable entities that can perform calculations or other manipulations of information. The processing circuitry, such as processing circuitry 410, may be coupled via a bus 405 to a memory 420.


The memory 420 may further include a memory portion 422 that may contain instructions that, when executed by the processing circuitry 410, may perform the method described in more detail herein. The memory 420 may be further used as a working scratch pad for the processing circuitry 410, a temporary storage, and others, as the case may be. The memory 420 may be a volatile memory such as, but not limited to, random access memory (RAM), or non-volatile memory (NVM), such as, but not limited to, flash memory. The processing circuitry 410 may be further connected to a network device 440, such as a network interface card, for providing connectivity between the computing device 400 and a network, such as network 510, discussed in more detail with respect to FIG. 5 below. The processing circuitry 410 may be further coupled with a storage device 430. The storage device 430 may be used for the purpose of storing single data type column-oriented data structures, data elements associated with the data structures, or any other data structures. While illustrated in FIG. 4 as a single device, it is to be understood that storage device 430 may include multiple devices either collocated or distributed.


The processing circuitry 410 and/or the memory 420 may also include machine-readable media for storing software. “Software” as used herein refers broadly to any type of instructions, whether referred to as software, firmware, middleware, microcode, hardware description language, or otherwise. Instructions may include code (e.g., in source code format, binary code format, executable code format, or any other suitable format of code). The instructions, when executed by the one or more processors, may cause the processing system to perform the various functions described in further detail herein.


In some embodiments, computing device 400 may include one or more input and output devices (not shown in the figure). Computing device 400 may also include a display 450, such as a touchscreen display or other display types discussed herein.



FIG. 5 is a block diagram of computing architecture 500 that may be used in connection with various disclosed embodiments. The computing device 400, as described in connection with FIG. 4, may be coupled to network 510. The network 510 may enable communication between different elements that may be communicatively coupled with the computing device 400, as further described below. The network 510 may include the Internet, the world-wide-web (WWW), a local area network (LAN), a wide area network (WAN), a metro area network (MAN), and other networks capable of enabling communication between the elements of the computing architecture 500. In some disclosed embodiments, the computing device 400 may be a server deployed in a cloud computing environment.


One or more user devices 520-1 through user device 520-m, where 'm' is an integer equal to or greater than 1, referred to individually as user device 520 and collectively as user devices 520, may be communicatively coupled with the computing device 400 via the network 510. A user device 520 may be, for example, a smart phone, a mobile phone, a laptop, a tablet computer, a wearable computing device, a personal computer (PC), a smart television, and the like. A user device 520 may be configured to send to and receive from the computing device 400 data and/or metadata associated with a variety of elements associated with single data type column-oriented data structures, such as columns, rows, cells, schemas, and the like.


One or more data repositories 530-1 through data repository 530-n, where 'n' is an integer equal to or greater than 1, referred to individually as data repository 530 and collectively as data repositories 530, may be communicatively coupled with the computing device 400 via the network 510, or embedded within the computing device 400. Each data repository 530 may be communicatively connected to the network 510 through one or more database management services (DBMS) 535-1 through DBMS 535-n. The data repository 530 may be, for example, a storage device containing a database, a data warehouse, and the like, that may be used for storing data structures, data items, metadata, or any information, as further described below. In some embodiments, one or more of the repositories may be distributed over several physical storage devices, e.g., in a cloud-based computing environment. Any storage device may be a network accessible storage device, or a component of the computing device 400.


The embodiments disclosed herein are exemplary and any other means for performing and facilitating display navigation operations may be consistent with this disclosure.


The foregoing description has been presented for purposes of illustration. It is not exhaustive and is not limited to the precise forms or embodiments disclosed. Modifications and adaptations will be apparent to those skilled in the art from consideration of the specification and practice of the disclosed embodiments.


Moreover, while illustrative embodiments have been described herein, the scope includes any and all embodiments having equivalent elements, modifications, omissions, combinations (e.g., of aspects across various embodiments), adaptations and/or alterations as would be appreciated by those skilled in the art based on the present disclosure. The limitations in the claims are to be interpreted broadly based on the language employed in the claims and not limited to examples described in the present specification or during the prosecution of the application. The examples are to be construed as non-exclusive. Furthermore, the steps of the disclosed methods may be modified in any manner, including by reordering steps and/or inserting or deleting steps. It is intended, therefore, that the specification and examples be considered as illustrative only, with a true scope and spirit being indicated by the following claims and their full scope of equivalents.


Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art. The materials, methods, and examples provided herein are illustrative only and not intended to be limiting.


Implementation of the method and system of the present disclosure may involve performing or completing certain selected tasks or steps manually, automatically, or a combination thereof. Moreover, according to actual instrumentation and equipment of preferred embodiments of the method and system of the present disclosure, several selected steps may be implemented by hardware (HW) or by software (SW) on any operating system of any firmware, or by a combination thereof. For example, as hardware, selected steps of the disclosure could be implemented as a chip or a circuit. As software or algorithm, selected steps of the disclosure could be implemented as a plurality of software instructions being executed by a computer using any suitable operating system. In any case, selected steps of the method and system of the disclosure could be described as being performed by a data processor, such as a computing device for executing a plurality of instructions.


As used herein, the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.


Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, specially designed ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various implementations can include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device.


Although the present disclosure is described with regard to a “computing device”, a “computer”, or “mobile device”, it should be noted that optionally any device featuring a data processor and the ability to execute one or more instructions may be described as a computing device, including but not limited to any type of personal computer (PC), a server, a distributed server, a virtual server, a cloud computing platform, a cellular telephone, an IP telephone, a smartphone, a smart watch or a PDA (personal digital assistant). Any two or more of such devices in communication with each other may optionally comprise a “network” or a “computer network.”


To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having a display device (an LED (light-emitting diode), OLED (organic LED), or LCD (liquid crystal display) monitor/screen) for displaying information to the user, and a touch-sensitive layer such as a touchscreen, or a keyboard and a pointing device (e.g., a mouse or a trackball), by which the user can provide input to the computer. Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user can be received in any form, including acoustic, speech, or tactile input.


The systems and techniques described here can be implemented in a computing system that includes a back end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front end component (e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back end, middleware, or front end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network (“LAN”), a wide area network (“WAN”), and the Internet.


The computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.


It should be appreciated that the above described methods and apparatus may be varied in many ways, including omitting or adding steps, changing the order of steps and the type of devices used. It should be appreciated that different features may be combined in different ways. In particular, not all the features shown above in a particular embodiment or implementation are necessary in every embodiment or implementation of the invention. Further combinations of the above features and implementations are also considered to be within the scope of some embodiments or implementations of the invention.


While certain features of the described implementations have been illustrated as described herein, many modifications, substitutions, changes and equivalents will now occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the scope of the implementations. It should be understood that they have been presented by way of example only, not limitation, and various changes in form and details may be made. Any portion of the apparatus and/or methods described herein may be combined in any combination, except mutually exclusive combinations. The implementations described herein can include various combinations and/or sub-combinations of the functions, components and/or features of the different implementations described.


Systems and methods disclosed herein involve unconventional improvements over conventional approaches. Descriptions of the disclosed embodiments are not exhaustive and are not limited to the precise forms or embodiments disclosed. Modifications and adaptations of the embodiments will be apparent from consideration of the specification and practice of the disclosed embodiments. Additionally, the disclosed embodiments are not limited to the examples discussed herein.


The foregoing description has been presented for purposes of illustration. It is not exhaustive and is not limited to the precise forms or embodiments disclosed. Modifications and adaptations of the embodiments will be apparent from consideration of the specification and practice of the disclosed embodiments. For example, the described implementations include hardware and software, but systems and methods consistent with the present disclosure may be implemented as hardware alone.


It is appreciated that the above described embodiments can be implemented by hardware, or software (program codes), or a combination of hardware and software. If implemented by software, it can be stored in the above-described computer-readable media. The software, when executed by the processor can perform the disclosed methods. The computing units and other functional units described in the present disclosure can be implemented by hardware, or software, or a combination of hardware and software. One of ordinary skill in the art will also understand that multiple ones of the above described modules/units can be combined as one module or unit, and each of the above described modules/units can be further divided into a plurality of sub-modules or sub-units.


The block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer hardware or software products according to various example embodiments of the present disclosure. In this regard, each block in a flowchart or block diagram may represent a module, segment, or portion of code, which includes one or more executable instructions for implementing the specified logical functions. It should be understood that in some alternative implementations, functions indicated in a block may occur out of the order noted in the figures. For example, two blocks shown in succession may be executed or implemented substantially concurrently, or two blocks may sometimes be executed in reverse order, depending upon the functionality involved. Some blocks may also be omitted. It should also be understood that each block of the block diagrams, and combinations of the blocks, may be implemented by special purpose hardware-based systems that perform the specified functions or acts, or by combinations of special purpose hardware and computer instructions.


In the foregoing specification, embodiments have been described with reference to numerous specific details that can vary from implementation to implementation. Certain adaptations and modifications of the described embodiments can be made. Other embodiments can be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. It is intended that the specification and examples be considered as examples only, with a true scope and spirit of the invention being indicated by the following claims. It is also intended that the sequence of steps shown in the figures is for illustrative purposes only and is not intended to be limited to any particular sequence of steps. As such, those skilled in the art can appreciate that these steps can be performed in a different order while implementing the same method.


It will be appreciated that the embodiments of the present disclosure are not limited to the exact construction that has been described above and illustrated in the accompanying drawings, and that various modifications and changes may be made without departing from the scope thereof.


Other embodiments will be apparent to those skilled in the art from consideration of the specification and practice of the embodiments disclosed herein. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosed embodiments being indicated by the following claims.


Computer programs based on the written description and methods of this specification are within the skill of a software developer. The various programs or program modules can be created using a variety of programming techniques. One or more of such software sections or modules can be integrated into a computer system, non-transitory computer readable media, or existing software.


This disclosure employs open-ended permissive language, indicating for example, that some embodiments “may” employ, involve, or include specific features. The use of the term “may” and other open-ended terminology is intended to indicate that although not every embodiment may employ the specific disclosed feature, at least one embodiment employs the specific disclosed feature.


Various terms used in the specification and claims may be defined or summarized differently when discussed in connection with differing disclosed embodiments. It is to be understood that the definitions, summaries and explanations of terminology in each instance apply to all instances, even when not repeated, unless the transitive definition, explanation or summary would result in inoperability of an embodiment.


Moreover, while illustrative embodiments have been described herein, the scope includes any and all embodiments having equivalent elements, modifications, omissions, combinations (e.g., of aspects across various embodiments), adaptations or alterations based on the present disclosure. The elements in the claims are to be interpreted broadly based on the language employed in the claims and not limited to examples described in the present specification or during the prosecution of the application. These examples are to be construed as non-exclusive. Further, the steps of the disclosed methods can be modified in any manner, including by reordering steps or inserting or deleting steps. It is intended, therefore, that the specification and examples be considered as exemplary only, with a true scope and spirit being indicated by the following claims and their full scope of equivalents.

Claims
  • 1. A non-transitory computer-readable medium containing instructions that, when executed, cause at least one processor to perform table navigation operations on a display, wherein the table comprises rows and columns and at least one cell that contains more information than presented on the display, the operations comprising: receiving a scroll signal for scrolling the table, wherein the scroll signal results from a motion of a user on the display; in response to a vertical component in the motion, causing the table to scroll vertically; in response to a horizontal component in the motion, causing the table to scroll horizontally; receiving an enhancing scroll signal resulting from an enhancing motion for revealing hidden information within the at least one cell on the table; and in response to the enhancing scroll signal, revealing at least a portion of the hidden information within the at least one cell.
  • 2. The non-transitory computer-readable medium of claim 1, wherein revealing the hidden information is progressive, and a ratio between an amount of the information revealed and an amount of information that is initially hidden is proportional to a length of the enhancing motion.
  • 3. The non-transitory computer-readable medium of claim 1, wherein the enhancing motion is different from a motion capable of scrolling the table horizontally or vertically.
  • 4. The non-transitory computer-readable medium of claim 1, wherein the enhancing motion corresponds to a motion capable of scrolling the table horizontally or vertically, a result of the motion corresponds to at least one of: revealing hidden information within the at least one cell on the table if the motion is performed on a predetermined portion of the display; or, causing the table to scroll horizontally or vertically if the motion is performed on a portion different from the predetermined portion of the display.
  • 5. The non-transitory computer-readable medium of claim 1, wherein revealing the hidden information ceases after the enhancing motion terminates.
  • 6. The non-transitory computer-readable medium of claim 1, wherein the hidden information is revealed for a predetermined time period.
  • 7. The non-transitory computer-readable medium of claim 1, wherein the enhancing scroll signal is based on a scroll motion performed beyond an end of the table.
  • 8. The non-transitory computer-readable medium of claim 7, wherein the at least one cell is located at a first end of the table or at a second end of the table opposite the first end.
  • 9. The non-transitory computer-readable medium of claim 7, wherein the hidden information is revealed while the enhanced scroll motion is performed on a predetermined portion of the display.
  • 10. The non-transitory computer-readable medium of claim 7, where the operations further comprise, if the scroll signal exceeds a threshold, causing both scrolling and a partial revealing of the hidden information.
  • 11. The non-transitory computer-readable medium of claim 1, wherein revealing the hidden information includes at least one of: enlarging the at least one cell in order to present all the information contained within the at least one cell; or modifying a size of all information contained within the at least one cell to enable presentation of all the information contained within the at least one cell without changing dimensions of the at least one cell.
  • 12. The non-transitory computer-readable medium of claim 10, wherein at least one dimension of one or more cells adjacent to the at least one cell remains unchanged.
  • 13. The non-transitory computer-readable medium of claim 10, wherein the enlarged at least one cell includes at least one of an increased height, an expansion in width, or an expansion in both height and width.
  • 14. The non-transitory computer-readable medium of claim 10, wherein, after the enhancing scroll signal terminates, the enlarged at least one cell returns to its original dimensions, or the size of all information contained within the at least one cell returns to its original size.
  • 15. The non-transitory computer-readable medium of claim 10, wherein the enhancing scroll signal causes a font size to change.
  • 16. The non-transitory computer-readable medium of claim 10, wherein revealing the hidden information further includes: enlarging the at least one cell; and wrapping one or more lines of text contained within the at least one cell.
  • 17. The non-transitory computer-readable medium of claim 1, wherein the operations further include locking the at least one cell in an enhanced form until an unlock signal is received.
  • 18. A system for performing table navigation operations on a display, wherein the table comprises rows and columns and at least one cell that contains more information than it presents, the system comprising: a memory storing instructions; and at least one processor that executes the stored instructions to: receive a scroll signal for scrolling the table, wherein the scroll signal results from a motion of a user on the display; in response to a vertical component in the motion, cause the table to scroll vertically; in response to a horizontal component in the motion, cause the table to scroll horizontally; receive an enhancing scroll signal resulting from an enhancing motion for revealing hidden information within the at least one cell on the table; and in response to the enhancing scroll signal, reveal at least a portion of the hidden information within the at least one cell.
  • 19. The system of claim 18, wherein revealing the hidden information includes at least one of: enlarging the at least one cell in order to present all the information contained within the at least one cell; or modifying a size of all information contained within the at least one cell to enable presentation of all the information contained within the at least one cell without changing dimensions of the at least one cell.
  • 20. A method for performing table navigation operations on a display, wherein the table comprises rows and columns and at least one cell that contains more information than it presents, the method comprising: receiving a scroll signal for scrolling the table, wherein the scroll signal results from a motion of a user on the display; in response to a vertical component in the motion, causing the table to scroll vertically; in response to a horizontal component in the motion, causing the table to scroll horizontally; receiving an enhancing scroll signal resulting from an enhancing motion for revealing hidden information within the at least one cell on the table; and in response to the enhancing scroll signal, revealing at least a portion of the hidden information within the at least one cell.
US Referenced Citations (793)
Number Name Date Kind
4972314 Getzinger et al. Nov 1990 A
5220657 Bly et al. Jun 1993 A
5479602 Baecker et al. Dec 1995 A
5517663 Kahn May 1996 A
5632009 Rao et al. May 1997 A
5682469 Linnett et al. Oct 1997 A
5696702 Skinner et al. Dec 1997 A
5726701 Needham Mar 1998 A
5787411 Groff et al. Jul 1998 A
5880742 Rao et al. Mar 1999 A
5933145 Meek Aug 1999 A
6016438 Wakayama Jan 2000 A
6016553 Schneider et al. Jan 2000 A
6023695 Osborn et al. Feb 2000 A
6034681 Miller et al. Mar 2000 A
6049622 Robb et al. Apr 2000 A
6088707 Bates et al. Jul 2000 A
6108573 Debbins et al. Aug 2000 A
6111573 McComb et al. Aug 2000 A
6167405 Rosensteel, Jr. et al. Dec 2000 A
6169534 Raffel et al. Jan 2001 B1
6182127 Cronin, III et al. Jan 2001 B1
6185582 Zellweger et al. Feb 2001 B1
6195794 Buxton Feb 2001 B1
6266067 Owen et al. Jul 2001 B1
6275809 Tamaki et al. Aug 2001 B1
6330022 Seligmann Dec 2001 B1
6377965 Hachamovitch et al. Apr 2002 B1
6385617 Malik May 2002 B1
6460043 Tabbara et al. Oct 2002 B1
6496832 Chi et al. Dec 2002 B2
6509912 Moran et al. Jan 2003 B1
6510459 Cronin, III et al. Jan 2003 B2
6522347 Tsuji et al. Feb 2003 B1
6527556 Koskinen Mar 2003 B1
6567830 Madduri May 2003 B1
6606740 Lynn et al. Aug 2003 B1
6636242 Bowman-Amuah Oct 2003 B2
6647370 Fu et al. Nov 2003 B1
6661431 Stuart et al. Dec 2003 B1
6988248 Tang et al. Jan 2006 B1
7027997 Robinson et al. Apr 2006 B1
7034860 Lia et al. Apr 2006 B2
7043529 Simonoff May 2006 B1
7054891 Cole May 2006 B2
7237188 Leung Jun 2007 B1
7249042 Doerr et al. Jul 2007 B1
7272637 Himmelstein Sep 2007 B1
7274375 David Sep 2007 B1
7379934 Forman et al. May 2008 B1
7383320 Silberstein et al. Jun 2008 B1
7389473 Sawicki et al. Jun 2008 B1
7415664 Aureglia et al. Aug 2008 B2
7417644 Cooper et al. Aug 2008 B2
7461077 Greenwood Dec 2008 B1
7489976 Adra Feb 2009 B2
7617443 Mills et al. Nov 2009 B2
7685152 Chivukula et al. Mar 2010 B2
7707514 Forstall et al. Apr 2010 B2
7710290 Johnson May 2010 B2
7770100 Chamberlain et al. Aug 2010 B2
7827476 Roberts et al. Nov 2010 B1
7827615 Allababidi et al. Nov 2010 B1
7916157 Kelley et al. Mar 2011 B1
7921360 Sundermeyer et al. Apr 2011 B1
7933952 Parker et al. Apr 2011 B2
7954064 Forstall et al. May 2011 B2
8046703 Busch et al. Oct 2011 B2
8078955 Gupta Dec 2011 B1
8082274 Steinglass et al. Dec 2011 B2
8108241 Shukoor Jan 2012 B2
8136031 Massand Mar 2012 B2
8151213 Weitzman et al. Apr 2012 B2
8223172 Miller et al. Jul 2012 B1
8286072 Chamberlain et al. Oct 2012 B2
8365095 Bansal et al. Jan 2013 B2
8375327 Lorch et al. Feb 2013 B2
8386960 Eismann et al. Feb 2013 B1
8407217 Zhang Mar 2013 B1
8413261 Nemoy et al. Apr 2013 B2
8423909 Zabielski Apr 2013 B2
8543566 Weissman et al. Sep 2013 B2
8548997 Wu Oct 2013 B1
8560942 Fortes et al. Oct 2013 B2
8566732 Louch et al. Oct 2013 B2
8572173 Briere et al. Oct 2013 B2
8578399 Khen et al. Nov 2013 B2
8601383 Folting et al. Dec 2013 B2
8620703 Kapoor et al. Dec 2013 B1
8621652 Slater, Jr. Dec 2013 B2
8635520 Christiansen et al. Jan 2014 B2
8677448 Kauffman et al. Mar 2014 B1
8738414 Nagar et al. May 2014 B1
8812471 Akita Aug 2014 B2
8819042 Samudrala et al. Aug 2014 B2
8825758 Bailor et al. Sep 2014 B2
8838533 Kwiatkowski et al. Sep 2014 B2
8862979 Hawking Oct 2014 B2
8863022 Rhodes et al. Oct 2014 B2
8869027 Louch et al. Oct 2014 B2
8937627 Otero et al. Jan 2015 B1
8938465 Messer Jan 2015 B2
8954871 Louch Feb 2015 B2
9007405 Eldar Apr 2015 B1
9015716 Fletcher et al. Apr 2015 B2
9026897 Zarras May 2015 B2
9043362 Weissman et al. May 2015 B2
9063958 Müller et al. Jun 2015 B2
9129234 Campbell et al. Sep 2015 B2
9159246 Rodriguez et al. Oct 2015 B2
9172738 daCosta Oct 2015 B1
9183303 Goel et al. Nov 2015 B1
9223770 Ledet Dec 2015 B1
9239719 Feinstein et al. Jan 2016 B1
9244917 Sharma et al. Jan 2016 B1
9253130 Zaveri Feb 2016 B2
9286246 Saito et al. Mar 2016 B2
9286475 Li et al. Mar 2016 B2
9292587 Kann et al. Mar 2016 B2
9336502 Mohammad et al. May 2016 B2
9342579 Cao et al. May 2016 B2
9361287 Simon et al. Jun 2016 B1
9390059 Gur et al. Jul 2016 B1
9424287 Schroth Aug 2016 B2
9424333 Bisignani et al. Aug 2016 B1
9424545 Lee Aug 2016 B1
9430458 Rhee et al. Aug 2016 B2
9449031 Barrus et al. Sep 2016 B2
9495386 Tapley et al. Nov 2016 B2
9519699 Kulkarni et al. Dec 2016 B1
9558172 Rampson et al. Jan 2017 B2
9613086 Sherman Apr 2017 B1
9635091 Laukkanen Apr 2017 B1
9679456 East Jun 2017 B2
9727376 Bills et al. Aug 2017 B1
9760271 Persaud Sep 2017 B2
9794256 Kiang Oct 2017 B2
9798829 Baisley Oct 2017 B1
9811676 Gauvin Nov 2017 B1
9866561 Psenka et al. Jan 2018 B2
9870136 Pourshahid Jan 2018 B2
10043296 Li Aug 2018 B2
10067928 Krappe Sep 2018 B1
10078668 Woodrow et al. Sep 2018 B1
10169306 O'Shaughnessy et al. Jan 2019 B2
10176154 Ben-Aharon et al. Jan 2019 B2
10235441 Makhlin et al. Mar 2019 B1
10255609 Kinkead et al. Apr 2019 B2
10282405 Silk et al. May 2019 B1
10282406 Bissantz May 2019 B2
10311080 Folting et al. Jun 2019 B2
10318624 Rosner et al. Jun 2019 B1
10327712 Beymer et al. Jun 2019 B2
10347017 Ruble et al. Jul 2019 B2
10372706 Chavan et al. Aug 2019 B2
10380140 Sherman Aug 2019 B2
10423758 Kido et al. Sep 2019 B2
10445702 Hunt Oct 2019 B1
10452360 Burman et al. Oct 2019 B1
10453118 Smith et al. Oct 2019 B2
10474317 Ramanathan et al. Nov 2019 B2
10489391 Tomlin Nov 2019 B1
10489462 Rogynskyy et al. Nov 2019 B1
10496737 Sayre et al. Dec 2019 B1
10505825 Bettaiah et al. Dec 2019 B1
10528599 Pandis et al. Jan 2020 B1
10534507 Laukkanen et al. Jan 2020 B1
10540152 Krishnaswamy et al. Jan 2020 B1
10540434 Migeon et al. Jan 2020 B2
10546001 Nguyen et al. Jan 2020 B1
10564622 Dean et al. Feb 2020 B1
10573407 Ginsburg Feb 2020 B2
10579724 Campbell et al. Mar 2020 B2
10587714 Kulkarni et al. Mar 2020 B1
10628002 Kang et al. Apr 2020 B1
10698594 Sanches et al. Jun 2020 B2
10706061 Sherman et al. Jul 2020 B2
10719220 Ouellet et al. Jul 2020 B2
10733256 Fickenscher et al. Aug 2020 B2
10740117 Ording et al. Aug 2020 B2
10747950 Dang et al. Aug 2020 B2
10748312 Ruble et al. Aug 2020 B2
10754688 Powell Aug 2020 B2
10761691 Anzures et al. Sep 2020 B2
10795555 Burke et al. Oct 2020 B2
10817660 Rampson et al. Oct 2020 B2
D910077 Naroshevitch et al. Feb 2021 S
10963578 More et al. Mar 2021 B2
11010371 Slomka et al. May 2021 B1
11030259 Mullins et al. Jun 2021 B2
11042363 Krishnaswamy et al. Jun 2021 B1
11042699 Sayre et al. Jun 2021 B1
11048714 Sherman et al. Jun 2021 B2
11086894 Srivastava et al. Aug 2021 B1
11222167 Gehrmann et al. Jan 2022 B2
11243688 Remy et al. Feb 2022 B1
20010008998 Tamaki et al. Jul 2001 A1
20010032248 Krafchin Oct 2001 A1
20010039551 Saito et al. Nov 2001 A1
20020002459 Lewis et al. Jan 2002 A1
20020065848 Walker et al. May 2002 A1
20020065849 Ferguson et al. May 2002 A1
20020065880 Hasegawa et al. May 2002 A1
20020069207 Alexander et al. Jun 2002 A1
20020075309 Michelman et al. Jun 2002 A1
20020082892 Raffel et al. Jun 2002 A1
20020138528 Gong et al. Sep 2002 A1
20030033196 Tomlin Feb 2003 A1
20030041113 Larsen Feb 2003 A1
20030051377 Chirafesi, Jr. Mar 2003 A1
20030058277 Bowman-Amuah Mar 2003 A1
20030065662 Cosic Apr 2003 A1
20030093408 Brown et al. May 2003 A1
20030101416 McInnes et al. May 2003 A1
20030135558 Bellotti et al. Jul 2003 A1
20030137536 Hugh Jul 2003 A1
20030187864 McGoveran Oct 2003 A1
20030200215 Chen et al. Oct 2003 A1
20030204490 Kasriel Oct 2003 A1
20040032432 Baynger Feb 2004 A1
20040098284 Petito et al. May 2004 A1
20040133441 Brady et al. Jul 2004 A1
20040138939 Theiler Jul 2004 A1
20040139400 Allam et al. Jul 2004 A1
20040162833 Jones et al. Aug 2004 A1
20040172592 Collie et al. Sep 2004 A1
20040212615 Lithe Oct 2004 A1
20040215443 Hatton Oct 2004 A1
20040230940 Cooper et al. Nov 2004 A1
20050034058 Mills et al. Feb 2005 A1
20050034064 Meyers et al. Feb 2005 A1
20050039001 Hudls et al. Feb 2005 A1
20050039033 Meyers et al. Feb 2005 A1
20050044486 Kotler et al. Feb 2005 A1
20050063615 Siegel et al. Mar 2005 A1
20050066306 Diab Mar 2005 A1
20050086360 Mamou et al. Apr 2005 A1
20050091314 Blagsvedt et al. Apr 2005 A1
20050091596 Anthony et al. Apr 2005 A1
20050096973 Heyse et al. May 2005 A1
20050114305 Haynes et al. May 2005 A1
20050125395 Boettiger Jun 2005 A1
20050165600 Kasravi et al. Jul 2005 A1
20050171881 Ghassemieh et al. Aug 2005 A1
20050216830 Turner et al. Sep 2005 A1
20050228250 Bitter et al. Oct 2005 A1
20050251021 Kaufman et al. Nov 2005 A1
20050257204 Bryant et al. Nov 2005 A1
20050278297 Nelson Dec 2005 A1
20050289170 Brown et al. Dec 2005 A1
20050289342 Needham et al. Dec 2005 A1
20050289453 Segal et al. Dec 2005 A1
20060009960 Valencot et al. Jan 2006 A1
20060013462 Sadikali Jan 2006 A1
20060015499 Clissold et al. Jan 2006 A1
20060015806 Wallace Jan 2006 A1
20060031148 O'Dell et al. Feb 2006 A1
20060031764 Keyser et al. Feb 2006 A1
20060036568 Moore et al. Feb 2006 A1
20060047811 Lau et al. Mar 2006 A1
20060053096 Subramanian et al. Mar 2006 A1
20060053194 Schneider et al. Mar 2006 A1
20060069604 Leukart et al. Mar 2006 A1
20060069635 Ram et al. Mar 2006 A1
20060080594 Chavoustie et al. Apr 2006 A1
20060090169 Daniels et al. Apr 2006 A1
20060106642 Reicher et al. May 2006 A1
20060107196 Thanu et al. May 2006 A1
20060111953 Setya May 2006 A1
20060129415 Thukral et al. Jun 2006 A1
20060136828 Asano Jun 2006 A1
20060150090 Swamidass Jul 2006 A1
20060173908 Browning et al. Aug 2006 A1
20060190313 Lu Aug 2006 A1
20060212299 Law Sep 2006 A1
20060224542 Yalamanchi Oct 2006 A1
20060224568 Debrito Oct 2006 A1
20060224946 Barrett et al. Oct 2006 A1
20060236246 Bono et al. Oct 2006 A1
20060250369 Keim Nov 2006 A1
20060253205 Gardiner Nov 2006 A1
20060271574 Villaron et al. Nov 2006 A1
20060287998 Folting et al. Dec 2006 A1
20060294451 Kelkar et al. Dec 2006 A1
20070027932 Thibeault Feb 2007 A1
20070033531 Marsh Feb 2007 A1
20070050322 Vigesaa et al. Mar 2007 A1
20070050379 Day et al. Mar 2007 A1
20070073899 Judge et al. Mar 2007 A1
20070092048 Chelstrom et al. Apr 2007 A1
20070094607 Morgan et al. Apr 2007 A1
20070101291 Forstall et al. May 2007 A1
20070106754 Moore May 2007 A1
20070118527 Winje et al. May 2007 A1
20070118813 Forstall et al. May 2007 A1
20070143169 Grant et al. Jun 2007 A1
20070168861 Bell et al. Jul 2007 A1
20070174228 Folting et al. Jul 2007 A1
20070174760 Chamberlain et al. Jul 2007 A1
20070186173 Both et al. Aug 2007 A1
20070220119 Himmelstein Sep 2007 A1
20070233647 Rawat et al. Oct 2007 A1
20070256043 Peters et al. Nov 2007 A1
20070282522 Geelen Dec 2007 A1
20070282627 Greenstein et al. Dec 2007 A1
20070283259 Barry et al. Dec 2007 A1
20070294235 Millett Dec 2007 A1
20070299795 Macbeth et al. Dec 2007 A1
20070300174 Macbeth et al. Dec 2007 A1
20070300185 Macbeth et al. Dec 2007 A1
20080004929 Raffel et al. Jan 2008 A9
20080005235 Hegde et al. Jan 2008 A1
20080033777 Shukoor Feb 2008 A1
20080034307 Cisler et al. Feb 2008 A1
20080034314 Louch et al. Feb 2008 A1
20080052291 Bender Feb 2008 A1
20080059312 Gern et al. Mar 2008 A1
20080059539 Chin et al. Mar 2008 A1
20080065460 Raynor Mar 2008 A1
20080077530 Banas et al. Mar 2008 A1
20080097748 Haley et al. Apr 2008 A1
20080104091 Chin May 2008 A1
20080126389 Mush et al. May 2008 A1
20080133736 Wensley et al. Jun 2008 A1
20080148140 Nakano Jun 2008 A1
20080155547 Weber et al. Jun 2008 A1
20080163075 Beck et al. Jul 2008 A1
20080183593 Dierks Jul 2008 A1
20080195948 Bauer Aug 2008 A1
20080209318 Allsop et al. Aug 2008 A1
20080216022 Lorch et al. Sep 2008 A1
20080222192 Hughes Sep 2008 A1
20080256014 Gould et al. Oct 2008 A1
20080256429 Penner et al. Oct 2008 A1
20080270597 Tenenti Oct 2008 A1
20080282189 Hofmann et al. Nov 2008 A1
20080295038 Helfman et al. Nov 2008 A1
20080301237 Parsons Dec 2008 A1
20090006171 Blatchley et al. Jan 2009 A1
20090006283 Labrie et al. Jan 2009 A1
20090013244 Cudich et al. Jan 2009 A1
20090019383 Riley et al. Jan 2009 A1
20090024944 Louch et al. Jan 2009 A1
20090048896 Anandan Feb 2009 A1
20090049372 Goldberg Feb 2009 A1
20090077164 Phillips et al. Mar 2009 A1
20090077217 McFarland et al. Mar 2009 A1
20090083140 Phan Mar 2009 A1
20090094514 Dargahi et al. Apr 2009 A1
20090113310 Appleyard et al. Apr 2009 A1
20090132470 Vignet May 2009 A1
20090150813 Chang et al. Jun 2009 A1
20090174680 Anzures et al. Jul 2009 A1
20090192787 Roon Jul 2009 A1
20090198715 Barbarek Aug 2009 A1
20090222760 Halverson et al. Sep 2009 A1
20090248710 McCormack et al. Oct 2009 A1
20090271696 Bailor et al. Oct 2009 A1
20090276692 Rosner Nov 2009 A1
20090313201 Huelsman et al. Dec 2009 A1
20090313537 Fu et al. Dec 2009 A1
20090313570 Po et al. Dec 2009 A1
20090319623 Srinivasan et al. Dec 2009 A1
20090319882 Morrison et al. Dec 2009 A1
20090327240 Meehan et al. Dec 2009 A1
20090327301 Lees et al. Dec 2009 A1
20090327851 Raposo Dec 2009 A1
20090327875 Kinkoh Dec 2009 A1
20100017699 Farrell et al. Jan 2010 A1
20100031135 Naghshin et al. Feb 2010 A1
20100070845 Facemire et al. Mar 2010 A1
20100070895 Messer Mar 2010 A1
20100083164 Martin et al. Apr 2010 A1
20100088636 Yerkes et al. Apr 2010 A1
20100095219 Stachowiak et al. Apr 2010 A1
20100095298 Seshadrinathan et al. Apr 2010 A1
20100100427 McKeown et al. Apr 2010 A1
20100100463 Molotsi et al. Apr 2010 A1
20100114926 Agrawal et al. May 2010 A1
20100149005 Yoon et al. Jun 2010 A1
20100174678 Massand Jul 2010 A1
20100228752 Folting et al. Sep 2010 A1
20100241477 Nylander et al. Sep 2010 A1
20100241948 Andeen et al. Sep 2010 A1
20100241972 Spataro et al. Sep 2010 A1
20100241990 Gabriel et al. Sep 2010 A1
20100251090 Chamberlain et al. Sep 2010 A1
20100251386 Gilzean et al. Sep 2010 A1
20100257015 Molander Oct 2010 A1
20100262625 Pittenger Oct 2010 A1
20100287163 Sridhar et al. Nov 2010 A1
20100287221 Battepati et al. Nov 2010 A1
20100313119 Baldwin et al. Dec 2010 A1
20100324964 Callanan et al. Dec 2010 A1
20100332973 Kloiber et al. Dec 2010 A1
20110010340 Hung et al. Jan 2011 A1
20110016432 Helfman Jan 2011 A1
20110028138 Davies-Moore et al. Feb 2011 A1
20110047484 Mount et al. Feb 2011 A1
20110055177 Chakra et al. Mar 2011 A1
20110066933 Ludwig Mar 2011 A1
20110071869 O'Brien et al. Mar 2011 A1
20110106636 Spear et al. May 2011 A1
20110119352 Perov et al. May 2011 A1
20110179371 Kopycinski et al. Jul 2011 A1
20110205231 Hartley et al. Aug 2011 A1
20110208324 Fukatsu Aug 2011 A1
20110208732 Melton et al. Aug 2011 A1
20110209150 Hammond et al. Aug 2011 A1
20110219321 Gonzalez Veron et al. Sep 2011 A1
20110225525 Chasman et al. Sep 2011 A1
20110231273 Buchheit Sep 2011 A1
20110289397 Eastmond et al. Nov 2011 A1
20110289439 Jugel Nov 2011 A1
20110298618 Stahl et al. Dec 2011 A1
20110302003 Shirish et al. Dec 2011 A1
20120029962 Podgurny et al. Feb 2012 A1
20120035974 Seybold Feb 2012 A1
20120036462 Schwartz et al. Feb 2012 A1
20120066587 Zhou et al. Mar 2012 A1
20120072821 Bowling Mar 2012 A1
20120079408 Rohwer Mar 2012 A1
20120081762 Yamada Apr 2012 A1
20120084798 Reeves et al. Apr 2012 A1
20120086716 Reeves et al. Apr 2012 A1
20120086717 Liu Apr 2012 A1
20120089610 Agrawal et al. Apr 2012 A1
20120089914 Holt et al. Apr 2012 A1
20120089992 Reeves et al. Apr 2012 A1
20120096389 Flam et al. Apr 2012 A1
20120096392 Ording et al. Apr 2012 A1
20120102432 Breedvelt-Schouten et al. Apr 2012 A1
20120102543 Kohli et al. Apr 2012 A1
20120110515 Abramoff et al. May 2012 A1
20120116834 Pope et al. May 2012 A1
20120116835 Pope et al. May 2012 A1
20120131445 Oyarzabal et al. May 2012 A1
20120151173 Shirley et al. Jun 2012 A1
20120158744 Tseng et al. Jun 2012 A1
20120192050 Campbell et al. Jul 2012 A1
20120198322 Gulwani et al. Aug 2012 A1
20120210252 Fedoseyeva et al. Aug 2012 A1
20120215574 Driessnack et al. Aug 2012 A1
20120215578 Swierz, III et al. Aug 2012 A1
20120233150 Naim et al. Sep 2012 A1
20120233533 Yücel et al. Sep 2012 A1
20120246170 Lantorno Sep 2012 A1
20120254252 Jin et al. Oct 2012 A1
20120254770 Ophir Oct 2012 A1
20120260190 Berger et al. Oct 2012 A1
20120278117 Nguyen et al. Nov 2012 A1
20120284197 Strick et al. Nov 2012 A1
20120297307 Rider et al. Nov 2012 A1
20120303262 Alam et al. Nov 2012 A1
20120304098 Kuulusa Nov 2012 A1
20120311496 Cao et al. Dec 2012 A1
20120311672 Connor et al. Dec 2012 A1
20130018952 McConnell et al. Jan 2013 A1
20130018953 McConnell et al. Jan 2013 A1
20130018960 Knysz et al. Jan 2013 A1
20130024418 Strick et al. Jan 2013 A1
20130024760 Vogel et al. Jan 2013 A1
20130036369 Mitchell et al. Feb 2013 A1
20130041958 Post et al. Feb 2013 A1
20130054514 Barrett-Kahn et al. Feb 2013 A1
20130055113 Chazin et al. Feb 2013 A1
20130063490 Zaman et al. Mar 2013 A1
20130086460 Folting et al. Apr 2013 A1
20130090969 Rivere Apr 2013 A1
20130097490 Kotler et al. Apr 2013 A1
20130103417 Seto et al. Apr 2013 A1
20130104035 Wagner et al. Apr 2013 A1
20130111320 Campbell et al. May 2013 A1
20130117268 Smith et al. May 2013 A1
20130159832 Ingargiola et al. Jun 2013 A1
20130159907 Brosche et al. Jun 2013 A1
20130179209 Milosevich Jul 2013 A1
20130211866 Gordon et al. Aug 2013 A1
20130212197 Karlson Aug 2013 A1
20130212234 Bartlett et al. Aug 2013 A1
20130238363 Ohta et al. Sep 2013 A1
20130238968 Barrus Sep 2013 A1
20130246384 Victor Sep 2013 A1
20130262527 Hunter Oct 2013 A1
20130268331 Bitz et al. Oct 2013 A1
20130297468 Hirsch et al. Nov 2013 A1
20130318424 Boyd Nov 2013 A1
20130339051 Dobrean Dec 2013 A1
20140006326 Bazanov Jan 2014 A1
20140019842 Montagna et al. Jan 2014 A1
20140033307 Schmidtler Jan 2014 A1
20140043331 Makinen et al. Feb 2014 A1
20140046638 Peloski Feb 2014 A1
20140052749 Rissanen Feb 2014 A1
20140058801 Deodhar et al. Feb 2014 A1
20140068403 Bhargav et al. Mar 2014 A1
20140074545 Minder et al. Mar 2014 A1
20140075301 Mihara Mar 2014 A1
20140082525 Kass et al. Mar 2014 A1
20140101527 Suciu Apr 2014 A1
20140108985 Scott et al. Apr 2014 A1
20140109012 Choudhary et al. Apr 2014 A1
20140111516 Hall et al. Apr 2014 A1
20140115515 Adams et al. Apr 2014 A1
20140115518 Abdukalykov et al. Apr 2014 A1
20140129960 Wang et al. May 2014 A1
20140136972 Rodgers et al. May 2014 A1
20140137003 Peters et al. May 2014 A1
20140137144 Järvenpää et al. May 2014 A1
20140172475 Olliphant et al. Jun 2014 A1
20140173401 Oshlag et al. Jun 2014 A1
20140181155 Homsany Jun 2014 A1
20140188748 Cavoue et al. Jul 2014 A1
20140195933 Rao Dv Jul 2014 A1
20140214404 Kalia et al. Jul 2014 A1
20140215303 Grigorovitch et al. Jul 2014 A1
20140249877 Hull et al. Sep 2014 A1
20140278638 Kreuzkamp et al. Sep 2014 A1
20140278720 Taguchi Sep 2014 A1
20140280287 Ganti et al. Sep 2014 A1
20140280377 Frew Sep 2014 A1
20140281868 Vogel et al. Sep 2014 A1
20140281869 Yob Sep 2014 A1
20140289223 Colwell et al. Sep 2014 A1
20140304174 Scott et al. Oct 2014 A1
20140306837 Hauck, III Oct 2014 A1
20140310345 Megiddo et al. Oct 2014 A1
20140324497 Verma et al. Oct 2014 A1
20140324501 Davidow et al. Oct 2014 A1
20140365938 Black et al. Dec 2014 A1
20140372856 Radakovitz et al. Dec 2014 A1
20140372932 Rutherford et al. Dec 2014 A1
20150032686 Kuchoor Jan 2015 A1
20150033131 Peev et al. Jan 2015 A1
20150033149 Kuchoor Jan 2015 A1
20150035918 Matsumoto et al. Feb 2015 A1
20150067556 Tibrewal et al. Mar 2015 A1
20150074721 Fishman et al. Mar 2015 A1
20150074728 Chai et al. Mar 2015 A1
20150088822 Raja et al. Mar 2015 A1
20150095752 Studer et al. Apr 2015 A1
20150106736 Torman et al. Apr 2015 A1
20150125834 Mendoza May 2015 A1
20150142676 McGinnis et al. May 2015 A1
20150142829 Lee et al. May 2015 A1
20150153943 Wang Jun 2015 A1
20150154660 Weald et al. Jun 2015 A1
20150169514 Sah et al. Jun 2015 A1
20150169531 Campbell et al. Jun 2015 A1
20150188964 Sharma et al. Jul 2015 A1
20150212717 Nair et al. Jul 2015 A1
20150220491 Cochrane et al. Aug 2015 A1
20150242091 Lu et al. Aug 2015 A1
20150249864 Tang et al. Sep 2015 A1
20150261796 Gould et al. Sep 2015 A1
20150278699 Danielsson Oct 2015 A1
20150281292 Murayama et al. Oct 2015 A1
20150295877 Roman Oct 2015 A1
20150317590 Karlson Nov 2015 A1
20150324453 Werner Nov 2015 A1
20150331846 Guggilla et al. Nov 2015 A1
20150363478 Haynes Dec 2015 A1
20150370540 Coslovi et al. Dec 2015 A1
20150370904 Joshi et al. Dec 2015 A1
20150378542 Saito et al. Dec 2015 A1
20150378711 Cameron et al. Dec 2015 A1
20150378979 Hirzel et al. Dec 2015 A1
20150379472 Gilmour et al. Dec 2015 A1
20160012111 Pattabhiraman et al. Jan 2016 A1
20160018962 Low et al. Jan 2016 A1
20160026939 Schiffer et al. Jan 2016 A1
20160027076 Jackson et al. Jan 2016 A1
20160055134 Sathish et al. Feb 2016 A1
20160055374 Zhang et al. Feb 2016 A1
20160063435 Shah et al. Mar 2016 A1
20160068960 Jung et al. Mar 2016 A1
20160078368 Kakhandiki et al. Mar 2016 A1
20160088480 Chen et al. Mar 2016 A1
20160092557 Stojanovic et al. Mar 2016 A1
20160117308 Haider et al. Apr 2016 A1
20160170586 Gallo Jun 2016 A1
20160173122 Akitomi et al. Jun 2016 A1
20160210572 Shaaban et al. Jul 2016 A1
20160224532 Miller et al. Aug 2016 A1
20160231915 Nhan et al. Aug 2016 A1
20160232489 Skaaksrud Aug 2016 A1
20160246490 Cabral Aug 2016 A1
20160253982 Cheung et al. Sep 2016 A1
20160259856 Ananthapur et al. Sep 2016 A1
20160275150 Bournonnais et al. Sep 2016 A1
20160299655 Migos et al. Oct 2016 A1
20160308963 Kung Oct 2016 A1
20160321235 He et al. Nov 2016 A1
20160321604 Imaeda et al. Nov 2016 A1
20160335302 Teodorescu et al. Nov 2016 A1
20160335303 Madhalam et al. Nov 2016 A1
20160335731 Hall Nov 2016 A1
20160335903 Mendoza Nov 2016 A1
20160344828 Häusler et al. Nov 2016 A1
20160350950 Ritchie et al. Dec 2016 A1
20160381099 Keslin et al. Dec 2016 A1
20170017779 Huang et al. Jan 2017 A1
20170031967 Chavan et al. Feb 2017 A1
20170041296 Ford et al. Feb 2017 A1
20170052937 Sirven et al. Feb 2017 A1
20170061342 Lore et al. Mar 2017 A1
20170061360 Rucker et al. Mar 2017 A1
20170061820 Firoozbakhsh Mar 2017 A1
20170063722 Cropper et al. Mar 2017 A1
20170075557 Noble et al. Mar 2017 A1
20170076101 Kochhar et al. Mar 2017 A1
20170090734 Fitzpatrick Mar 2017 A1
20170090736 King et al. Mar 2017 A1
20170091337 Patterson Mar 2017 A1
20170093876 Feng et al. Mar 2017 A1
20170109499 Doshi et al. Apr 2017 A1
20170111327 Wu Apr 2017 A1
20170116552 Deodhar et al. Apr 2017 A1
20170124042 Campbell et al. May 2017 A1
20170124048 Campbell et al. May 2017 A1
20170124055 Radakovitz et al. May 2017 A1
20170124740 Campbell et al. May 2017 A1
20170126772 Campbell et al. May 2017 A1
20170132296 Ding May 2017 A1
20170132652 Kedzlie et al. May 2017 A1
20170139874 Chin May 2017 A1
20170139884 Bendig et al. May 2017 A1
20170139891 Ah-Soon et al. May 2017 A1
20170140047 Bendig et al. May 2017 A1
20170140219 King et al. May 2017 A1
20170153771 Chu Jun 2017 A1
20170161246 Klima Jun 2017 A1
20170177888 Arora et al. Jun 2017 A1
20170185668 Convertino et al. Jun 2017 A1
20170200122 Edson et al. Jul 2017 A1
20170206366 Fay et al. Jul 2017 A1
20170212924 Semlani et al. Jul 2017 A1
20170220813 Mullins et al. Aug 2017 A1
20170221072 AthuluruTirumala et al. Aug 2017 A1
20170228445 Chiu et al. Aug 2017 A1
20170228460 Amel et al. Aug 2017 A1
20170236081 Grady Smith et al. Aug 2017 A1
20170242921 Rota Aug 2017 A1
20170262786 Khasis Sep 2017 A1
20170270970 Ho et al. Sep 2017 A1
20170272316 Johnson et al. Sep 2017 A1
20170272331 Lissack Sep 2017 A1
20170277669 Sekharan Sep 2017 A1
20170285879 Pilkington et al. Oct 2017 A1
20170285890 Dolman Oct 2017 A1
20170301039 Dyer et al. Oct 2017 A1
20170315683 Boucher et al. Nov 2017 A1
20170315974 Kong et al. Nov 2017 A1
20170315979 Boucher et al. Nov 2017 A1
20170324692 Zhou Nov 2017 A1
20170351252 Kleifges et al. Dec 2017 A1
20170372442 Mejias Dec 2017 A1
20180011827 Avery et al. Jan 2018 A1
20180025084 Conlan et al. Jan 2018 A1
20180026954 Toepke et al. Jan 2018 A1
20180032492 Altshuller et al. Feb 2018 A1
20180032570 Miller et al. Feb 2018 A1
20180055434 Cheung et al. Mar 2018 A1
20180075104 Oberbreckling et al. Mar 2018 A1
20180075115 Murray et al. Mar 2018 A1
20180075413 Culver et al. Mar 2018 A1
20180075560 Thukral et al. Mar 2018 A1
20180081863 Bathla Mar 2018 A1
20180081868 Willcock et al. Mar 2018 A1
20180088753 Viégas et al. Mar 2018 A1
20180088989 Nield et al. Mar 2018 A1
20180089299 Collins et al. Mar 2018 A1
20180095938 Monte Apr 2018 A1
20180096417 Cook et al. Apr 2018 A1
20180109760 Metter et al. Apr 2018 A1
20180121994 Matsunaga et al. May 2018 A1
20180128636 Zhou May 2018 A1
20180129651 Latvala et al. May 2018 A1
20180157455 Troy et al. Jun 2018 A1
20180157467 Stachura Jun 2018 A1
20180157468 Stachura Jun 2018 A1
20180157633 He et al. Jun 2018 A1
20180173715 Dunne Jun 2018 A1
20180181650 Komatsuda et al. Jun 2018 A1
20180181716 Mander et al. Jun 2018 A1
20180210936 Reynolds et al. Jul 2018 A1
20180225270 Bhide et al. Aug 2018 A1
20180260371 Theodore et al. Sep 2018 A1
20180276417 Cerezo Sep 2018 A1
20180293217 Callaghan Oct 2018 A1
20180293669 Jackson et al. Oct 2018 A1
20180329930 Eberlein et al. Nov 2018 A1
20180330320 Kohli Nov 2018 A1
20180357305 Kinast et al. Dec 2018 A1
20180365429 Segal Dec 2018 A1
20180367484 Rodriguez et al. Dec 2018 A1
20180373434 Switzer et al. Dec 2018 A1
20180373757 Schukovets et al. Dec 2018 A1
20190005094 Yi et al. Jan 2019 A1
20190012342 Cohn Jan 2019 A1
20190036989 Eirinberg et al. Jan 2019 A1
20190042628 Rajpara Feb 2019 A1
20190050445 Griffith et al. Feb 2019 A1
20190050812 Boileau Feb 2019 A1
20190056856 Simmons et al. Feb 2019 A1
20190065545 Hazel et al. Feb 2019 A1
20190068703 Vora et al. Feb 2019 A1
20190073350 Shiotani Mar 2019 A1
20190095413 Davis et al. Mar 2019 A1
20190108046 Spencer-Harper et al. Apr 2019 A1
20190113935 Kuo et al. Apr 2019 A1
20190114308 Hancock Apr 2019 A1
20190123924 Embiricos et al. Apr 2019 A1
20190130611 Black et al. May 2019 A1
20190138588 Silk et al. May 2019 A1
20190138653 Roller et al. May 2019 A1
20190155821 Dirisala May 2019 A1
20190179501 Seeley et al. Jun 2019 A1
20190208058 Dvorkin et al. Jul 2019 A1
20190220161 Loftus Jul 2019 A1
20190236188 McKenna Aug 2019 A1
20190243879 Harley et al. Aug 2019 A1
20190251884 Burns et al. Aug 2019 A1
20190258461 Li et al. Aug 2019 A1
20190258706 Li et al. Aug 2019 A1
20190286839 Mutha et al. Sep 2019 A1
20190306009 Makovsky et al. Oct 2019 A1
20190324840 Malamut et al. Oct 2019 A1
20190325012 Delaney et al. Oct 2019 A1
20190347077 Huebra Nov 2019 A1
20190361879 Rogynskyy et al. Nov 2019 A1
20190361971 Zenger et al. Nov 2019 A1
20190364009 Joseph et al. Nov 2019 A1
20190371442 Schoenberg Dec 2019 A1
20190391707 Ristow et al. Dec 2019 A1
20200005248 Gerzi et al. Jan 2020 A1
20200005295 Murphy Jan 2020 A1
20200012629 Lereya et al. Jan 2020 A1
20200019548 Agnew et al. Jan 2020 A1
20200019595 Azua Jan 2020 A1
20200026397 Wohlstadter et al. Jan 2020 A1
20200042648 Rao Feb 2020 A1
20200050696 Mowatt et al. Feb 2020 A1
20200053176 Jimenez et al. Feb 2020 A1
20200125574 Ghoshal et al. Apr 2020 A1
20200134002 Tung et al. Apr 2020 A1
20200142546 Breedvelt-Schouten et al. May 2020 A1
20200151630 Shakhnovich May 2020 A1
20200159558 Bak et al. May 2020 A1
20200175094 Palmer Jun 2020 A1
20200192785 Chen Jun 2020 A1
20200247661 Rao et al. Aug 2020 A1
20200265112 Fox et al. Aug 2020 A1
20200279315 Manggala Sep 2020 A1
20200293616 Nelson et al. Sep 2020 A1
20200301678 Burman et al. Sep 2020 A1
20200301902 Maloy et al. Sep 2020 A1
20200310835 Momchilov Oct 2020 A1
20200327244 Blass et al. Oct 2020 A1
20200334019 Bosworth et al. Oct 2020 A1
20200348809 Drescher Nov 2020 A1
20200349320 Owens Nov 2020 A1
20200356740 Principato Nov 2020 A1
20200356873 Nawrocke et al. Nov 2020 A1
20200380212 Butler et al. Dec 2020 A1
20200380449 Choi Dec 2020 A1
20200387664 Kusumura et al. Dec 2020 A1
20200401581 Eubank et al. Dec 2020 A1
20210014136 Rath Jan 2021 A1
20210019287 Prasad et al. Jan 2021 A1
20210021603 Gibbons Jan 2021 A1
20210034058 Subramanian et al. Feb 2021 A1
20210042796 Khoury et al. Feb 2021 A1
20210049555 Shor Feb 2021 A1
20210055955 Yankelevich et al. Feb 2021 A1
20210056509 Lindy Feb 2021 A1
20210072883 Migunova et al. Mar 2021 A1
20210073526 Zeng et al. Mar 2021 A1
20210084120 Fisher et al. Mar 2021 A1
20210124749 Suzuki et al. Apr 2021 A1
20210124872 Lereya Apr 2021 A1
20210136027 Barbitta et al. May 2021 A1
20210149553 Lereya et al. May 2021 A1
20210150489 Haramati et al. May 2021 A1
20210165782 Deshpande et al. Jun 2021 A1
20210166196 Lereya et al. Jun 2021 A1
20210166339 Mann et al. Jun 2021 A1
20210173682 Chakraborti et al. Jun 2021 A1
20210174006 Stokes Jun 2021 A1
20210192126 Gehrmann et al. Jun 2021 A1
20210264220 Wei et al. Aug 2021 A1
20210326519 Lin et al. Oct 2021 A1
20220221591 Smith et al. Jul 2022 A1
Foreign Referenced Citations (18)
Number Date Country
2828011 Sep 2012 CA
103064833 Apr 2013 CN
107123424 Sep 2017 CN
107422666 Dec 2017 CN
107623596 Jan 2018 CN
107885656 Apr 2018 CN
112929172 Jun 2021 CN
3443466 Dec 2021 EP
20150100760 Sep 2015 KR
WO 2004100015 Nov 2004 WO
WO 2006116580 Nov 2006 WO
WO 2008109541 Sep 2008 WO
WO 2017202159 Nov 2017 WO
WO 2020187408 Sep 2020 WO
WO 2021096944 May 2021 WO
WO 2021144656 Jul 2021 WO
WO 2021161104 Aug 2021 WO
WO 2021220058 Nov 2021 WO
Non-Patent Literature Citations (27)
Entry
D'Alessio et al., Monday.com Walkthrough 2018 | All Features, Platforms & Thoughts, Mar. 1, 2018, pp. 1-55, 2018.
Rodrigo et al., Project Management with Monday.com: a 101 Introduction; Jul. 22, 2019, pp. 1-21, 2019.
International Search Report and Written Opinion of the International Searching Authority in PCT/IB2020/000658, dated Nov. 11, 2020 (12 pages).
International Search Report in PCT/IB2020/000974, dated May 3, 2021 (19 pages).
International Search Report in PCT/IB2021/000090, dated Jul. 27, 2021.
ShowMyPC, “Switch Presenter While Using ShowMyPC”; web archive.org; Aug. 20, 2016.
International Search Report and Written Opinion of the International Searching Authority in PCT/IB2020/000024, dated May 3, 2021 (13 pages).
“Pivot table—Wikipedia”; URL: https://en.wikipedia.org/w/index.php?title=Pivot_table&oldid=857163289, originally retrieved on Oct. 23, 2019; retrieved on Jul. 16, 2021.
Vishal Singh, “A Theoretical Framework of a BIM-based Multi-Disciplinary Collaboration Platform”, Nov. 5, 2020, Automation in Construction, 20 (2011), pp. 134-144 (Year: 2011).
Edward A. Stohr, Workflow Automation: Overview and Research Issues, 2001, Information Systems Frontiers 3:3, pp. 281-296 (Year: 2001).
International Search Report and Written Opinion of the International Searching Authority in PCT/IB2021/000297, dated Oct. 12, 2021 (20 pages).
Dapulse.com “features”, extracted from web.archive.org/web/2014091818421/https://dapulse.com/features; Sep. 2014 (Year: 2014).
Stephen Larson et al., Introducing Data Mining Concepts Using Microsoft Excel's Table Analysis Tools, Oct. 2015, [Retrieved on Nov. 19, 2021], Retrieved from the internet: <URL: https://dl.acm.org/doi/pdf/10.5555/2831373.2831394> 3 Pages (127-129) (Year: 2015).
Isaiah Pinchas et al., Lexical Analysis Tool, May 2004, [Retrieved on Nov. 19, 2021], Retrieved from the internet: <URL: https://dl.acm.org/doi/pdf/10.1145/997140.997147> 9 Pages (66-74) (Year: 2004).
Sajjad Bahrebar et al., “A Novel Type-2 Fuzzy Logic for Improved Risk Analysis of Proton Exchange Membrane Fuel Cells in Marine Power Systems Application”, Energies, 11, 721, pp. 1-16, Mar. 22, 2018.
Pedersen et al., “Tivoli: an electronic whiteboard for informal workgroup meetings”, Conference on Human Factors in Computing Systems: Proceedings of the INTERACT '93 and CHI '93 conference on Human factors in computing systems; Apr. 24-29, 1993:391-398. (Year: 1993).
Kollmann, Franz, “Realizing Fine-Granular Read and Write Rights on Tree Structured Documents.” in the Second International Conference on Availability, Reliability and Security (ARES'07), pp. 517-523. IEEE, 2007. (Year: 2007).
Baarslag, “Negotiation as an Interaction Mechanism for Deciding App Permissions.” In Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems, pp. 2012-2019. 2016 (Year: 2016).
Peltier, “Clustered and Stacked Column and Bar Charts”, Aug. 2011, Peltier Technical Services, Inc., pp. 1-128; (Year: 2011).
Beate List, “An Evaluation of Conceptual Business Process Modelling Languages”, 2006, SAC'06, Apr. 23-27, pp. 1532-1539 (Year: 2006).
“Demostración en español de Monday.com”, published Feb. 20, 2019. https://www.youtube.com/watch?v=z0qydTgof1A (Year: 2019).
Desmedt, Yvo, and Arash Shaghaghi, “Function-Based Access Control (FBAC) From Access Control Matrix to Access Control Tensor.” In Proceedings of the 8th ACM CCS International Workshop on Managing Insider Security Threats, pp. 89-92. (2016).
Anupam, V., et al., “Personalizing the Web Using Site Descriptions”, Proceedings of the Tenth International Workshop on Database and Expert Systems Applications, ISBN: 0-7695-0281-4, DOI: 10.1109/DEXA.1999.795275, Jan. 1, 1999, pp. 732-738. (Year: 1999).
Gutwin, C. et al., “Supporting Informal Collaboration in Shared-Workspace Groupware”, J. Univers. Comput. Sci., 14(9), 1411-1434 (2008).
Barai, S., et al., “Image Annotation System Using Visual and Textual Features”, In: Proceedings of the 16th International Conference on Distributed Multi-media Systems, pp. 289-296 (2010).
B. Ionescu, C. Gadea, B. Solomon, M. Trifan, D. Ionescu and V. Stoicu-Tivadar, “A chat-centric collaborative environment for web-based real-time collaboration,” 2015 IEEE 10th Jubilee International Symposium on Applied Computational Intelligence and Informatics, Timisoara, Romania, 2015, pp. 105-110 (Year: 2015).
Susanne Hupfer, Li-Te Cheng, Steven Ross, and John Patterson. 2004. Introducing collaboration into an application development environment. In Proceedings of the 2004 ACM conference on Computer supported cooperative work (CSCW '04). Association for Computing Machinery, New York, NY, USA, 21-24 (Year: 2004).