Digital processing systems and methods for embedded live application in-line in a word processing document in collaborative work systems

Information

  • Patent Grant
  • Patent Number
    11,893,213
  • Date Filed
    Thursday, December 30, 2021
  • Date Issued
    Tuesday, February 6, 2024
Abstract
Systems, methods, and computer-readable media for causing dynamic activity in an electronic word processing document are disclosed. The systems and methods may involve accessing an electronic word processing document; presenting an interface enabling selection of a live application for embedding in the electronic word processing document; embedding, in-line with text of the electronic word processing document, a live active icon representative of the live application; presenting, in a first viewing mode, the live active icon; receiving a selection of the live active icon; in response to the selection, presenting in a second viewing mode, an expanded view of the live application; receiving a collapse instruction; and in response to the collapse instruction, reverting from the second viewing mode to the first viewing mode.
Description
TECHNICAL FIELD

Embodiments consistent with the present disclosure include systems and methods for collaborative work systems. The disclosed systems and methods may be implemented using a combination of conventional hardware and software as well as specialized hardware and software, such as a machine constructed and/or programmed specifically for performing functions associated with the disclosed method steps. Consistent with other disclosed embodiments, non-transitory computer-readable storage media may store program instructions, which may be executable by at least one processing device and perform any of the steps and/or methods described herein.


BACKGROUND

Operation of modern enterprises can be complicated and time consuming. In many cases, managing the operation of a single project requires integration of several employees, departments, and other resources of the entity. To manage the challenging operation, project management software applications may be used. Such software applications allow a user to organize, plan, and manage resources by providing project-related information in order to optimize the time and resources spent on each project. It would be useful to improve these software applications to increase operation management efficiency.


SUMMARY

One aspect of the present disclosure may be directed to systems, methods, and computer readable media for causing dynamic activity in an electronic word processing document. The system may include at least one processor configured to: access an electronic word processing document; present an interface enabling selection of a live application, outside the electronic word processing document, for embedding in the electronic word processing document; embed, in-line with text of the electronic word processing document, a live active icon representative of the live application; present, in a first viewing mode, the live active icon wherein during the first viewing mode, the live active icon may be displayed embedded in-line with the text, and the live active icon dynamically changes based on occurrences outside the electronic word processing document; receive a selection of the live active icon; in response to the selection, present in a second viewing mode, an expanded view of the live application; receive a collapse instruction; and in response to the collapse instruction, revert from the second viewing mode to the first viewing mode.
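By way of illustration only, the expand/collapse behavior described above can be sketched as a small two-state machine. The class and attribute names below (LiveActiveIcon, ViewMode, and so on) are hypothetical and are not drawn from the disclosure; this is a minimal sketch of the described flow, not an implementation of it.

```python
# Illustrative sketch of the two viewing modes: a live active icon
# presented in-line (first viewing mode) that expands on selection
# (second viewing mode) and reverts on a collapse instruction.
# All names here are hypothetical.
from enum import Enum


class ViewMode(Enum):
    ICON = "first"       # collapsed: in-line live active icon
    EXPANDED = "second"  # expanded view of the live application


class LiveActiveIcon:
    """An in-line icon that toggles between two viewing modes."""

    def __init__(self, application_name: str):
        self.application_name = application_name
        self.mode = ViewMode.ICON  # initially presented in the first viewing mode

    def select(self) -> None:
        # A selection of the icon presents the second viewing mode.
        self.mode = ViewMode.EXPANDED

    def collapse(self) -> None:
        # A collapse instruction reverts to the first viewing mode.
        self.mode = ViewMode.ICON


icon = LiveActiveIcon("calendar")
icon.select()
assert icon.mode is ViewMode.EXPANDED
icon.collapse()
assert icon.mode is ViewMode.ICON
```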





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a block diagram of an exemplary computing device which may be employed in connection with embodiments of the present disclosure.



FIG. 2 is a block diagram of an exemplary computing architecture for collaborative work systems, consistent with embodiments of the present disclosure.



FIG. 3 illustrates an example of an electronic collaborative word processing document, consistent with some embodiments of the present disclosure.



FIG. 4 illustrates an example of an electronic word processing document, consistent with some embodiments of the present disclosure.



FIG. 5 illustrates an example of an interface enabling selection of a live application, consistent with some embodiments of the present disclosure.



FIG. 6 illustrates an example of an electronic word processing document with embedded live active icons in-line with the text, consistent with some embodiments of the present disclosure.



FIG. 7A illustrates an example of a live active icon in a first viewing mode, consistent with some embodiments of the present disclosure.



FIG. 7B illustrates an example of a live active icon that has dynamically changed based on occurrences outside the electronic word processing document, consistent with some embodiments of the present disclosure.



FIG. 8A illustrates an example of a live active icon in a first viewing mode, consistent with some embodiments of the present disclosure.



FIG. 8B illustrates an example of a live active icon with an animation that plays in-line with the text during the first viewing mode, consistent with some embodiments of the present disclosure.



FIG. 9A illustrates an example of receiving a selection of the live active icon, consistent with some embodiments of the present disclosure.



FIG. 9B illustrates a second viewing mode, an expanded view of the live application, consistent with some embodiments of the present disclosure.



FIG. 10 illustrates a block diagram of an example process for causing dynamic activity in an electronic word processing document, consistent with some embodiments of the present disclosure.



FIG. 11 illustrates an example of an electronic word processing document, consistent with some embodiments of the present disclosure.



FIG. 12 illustrates an example of a file external to an electronic word processing document, consistent with some embodiments of the present disclosure.



FIG. 13 illustrates an example of an interface enabling designation of document text as a variable data element, designation of a file as a source of replacement data, and permissions to be set on a variable data element, consistent with some embodiments of the present disclosure.



FIG. 14 illustrates an example of an electronic word processing document possessing variable data elements, consistent with some embodiments of the present disclosure.



FIG. 15A illustrates an example of replacement data present in a file external to the electronic word processing document corresponding to current data of a variable data element in the electronic word processing document, consistent with some embodiments of the present disclosure.



FIG. 15B illustrates an example of current data of a variable data element in an electronic word processing document being replaced by replacement data from an external file, consistent with some embodiments of the present disclosure.



FIG. 16A illustrates an example of a change to a variable data element in the electronic word processing document, consistent with some embodiments of the present disclosure.



FIG. 16B illustrates an example of a file external to the electronic word processing document being updated to reflect a change to a variable data element in the electronic word processing document, consistent with some embodiments of the present disclosure.



FIG. 17A illustrates an example of a variable data element being selected, consistent with some embodiments of the present disclosure.



FIG. 17B illustrates an example of an iframe, containing information from an external file, being presented in response to a selection of a variable data element, consistent with some embodiments of the present disclosure.



FIG. 18 illustrates a block diagram of an example process for automatically updating an electronic word processing document based on a change in a linked file and vice versa, consistent with some embodiments of the present disclosure.





DETAILED DESCRIPTION

Exemplary embodiments are described with reference to the accompanying drawings. The figures are not necessarily drawn to scale. While examples and features of disclosed principles are described herein, modifications, adaptations, and other implementations are possible without departing from the spirit and scope of the disclosed embodiments. Also, the words “comprising,” “having,” “containing,” and “including,” and other similar forms are intended to be equivalent in meaning and be open ended in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items. It should also be noted that as used herein and in the appended claims, the singular forms “a,” “an,” and “the” include plural references unless the context clearly dictates otherwise.


In the following description, various working examples are provided for illustrative purposes. However, it is to be understood that the present disclosure may be practiced without one or more of these details.


Throughout, this disclosure mentions “disclosed embodiments,” which refer to examples of inventive ideas, concepts, and/or manifestations described herein. Many related and unrelated embodiments are described throughout this disclosure. The fact that some “disclosed embodiments” are described as exhibiting a feature or characteristic does not mean that other disclosed embodiments necessarily share that feature or characteristic.


This disclosure presents various mechanisms for collaborative work systems. Such systems may involve software that enables multiple users to work collaboratively. By way of one example, workflow management software may enable various members of a team to cooperate via a common online platform. It is intended that one or more aspects of any mechanism may be combined with one or more aspects of any other mechanism, and such combinations are within the scope of this disclosure.


This disclosure is constructed to provide a basic understanding of a few exemplary embodiments with the understanding that features of the exemplary embodiments may be combined with other disclosed features or may be incorporated into platforms or embodiments not described herein while still remaining within the scope of this disclosure. For convenience, any form of the word “embodiment” as used herein is intended to refer to a single embodiment or multiple embodiments of the disclosure.


Certain embodiments disclosed herein include devices, systems, and methods for collaborative work systems that may allow a user to interact with information in real time. To avoid repetition, the functionality of some embodiments is described herein solely in connection with a processor or at least one processor. It is to be understood that such exemplary descriptions of functionality apply equally to methods and computer readable media and constitute a written description of systems, methods, and computer readable media. The underlying platform may allow a user to structure systems, methods, or computer readable media in many ways using common building blocks, thereby permitting flexibility in constructing a product that suits desired needs. This may be accomplished through the use of boards. A board may be a table configured to contain items (e.g., individual items presented in horizontal rows) defining objects or entities that are managed in the platform (task, project, client, deal, etc.). Unless expressly noted otherwise, the terms “board” and “table” may be considered synonymous for purposes of this disclosure. In some embodiments, a board may contain information beyond that which is displayed in a table. Boards may include sub-boards that may have a separate structure from a board. Sub-boards may be tables with sub-items that may be related to the items of a board. Columns intersecting with rows of items may together define cells in which data associated with each item may be maintained. Each column may have a heading or label defining an associated data type. When used herein in combination with a column, a row may be presented horizontally and a column vertically. However, in the broader generic sense as used herein, the term “row” may refer to one or more of a horizontal and/or a vertical presentation.
A table or tablature, as used herein, refers to data presented in horizontal and vertical rows (e.g., horizontal rows and vertical columns) defining cells in which data is presented. Tablature may refer to any structure for presenting data in an organized manner, as previously discussed, such as cells presented in horizontal rows and vertical columns, vertical rows and horizontal columns, a tree data structure, a web chart, or any other structured representation, as explained throughout this disclosure. A cell may refer to a unit of information contained in the tablature defined by the structure of the tablature. For example, a cell may be defined as an intersection between a horizontal row and a vertical column in a tablature having rows and columns. A cell may also be defined as an intersection between a horizontal and a vertical row, or as an intersection between a horizontal and a vertical column. As a further example, a cell may be defined as a node on a web chart or a node on a tree data structure. As would be appreciated by a skilled artisan, however, the disclosed embodiments are not limited to any specific structure, but rather may be practiced in conjunction with any desired organizational arrangement. In addition, tablature may include any type of information, depending on intended use. When used in conjunction with a workflow management application, the tablature may include any information associated with one or more tasks, such as one or more status values, projects, countries, persons, teams, progress statuses, a combination thereof, or any other information related to a task.
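The board structure described above, with items as rows, typed columns, and cells at their intersections, can be sketched as a simple data model. The classes and field names below are hypothetical, chosen only to mirror the terminology of this disclosure.

```python
# Hypothetical sketch of a board: items (rows), columns with a heading
# and an associated data type, and cells at row/column intersections.
from dataclasses import dataclass, field


@dataclass
class Column:
    heading: str
    data_type: str  # e.g. "status", "person", "date"


@dataclass
class Item:
    name: str
    cells: dict = field(default_factory=dict)  # column heading -> cell value


@dataclass
class Board:
    name: str
    columns: list = field(default_factory=list)
    items: list = field(default_factory=list)

    def cell(self, item_name: str, heading: str):
        # A cell is the intersection of an item (row) and a column.
        for item in self.items:
            if item.name == item_name:
                return item.cells.get(heading)
        return None


board = Board("Project A", columns=[Column("Status", "status")])
board.items.append(Item("Task 1", {"Status": "Done"}))
assert board.cell("Task 1", "Status") == "Done"
```

The same cell abstraction could equally sit over a tree or web-chart structure, as the paragraph above notes; only the lookup in `cell` would change.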


While a table view may be one way to present and manage the data contained on a board, a table's or board's data may be presented in different ways. For example, in some embodiments, dashboards may be utilized to present or summarize data derived from one or more boards. A dashboard may be a non-table form of presenting data, using, for example, static or dynamic graphical representations. A dashboard may also include multiple non-table forms of presenting data. As discussed later in greater detail, such representations may include various forms of graphs or graphics. In some instances, dashboards (which may also be referred to more generically as “widgets”) may include tablature. Software links may interconnect one or more boards with one or more dashboards thereby enabling the dashboards to reflect data presented on the boards. This may allow, for example, data from multiple boards to be displayed and/or managed from a common location. These widgets may provide visualizations that allow a user to update data derived from one or more boards.
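A dashboard widget linked to several boards, as described above, can be sketched as an object that derives its presentation from the boards' data so that a change on any linked board is reflected when the widget renders. The names below are hypothetical.

```python
# Hypothetical sketch of a dashboard widget software-linked to boards:
# the widget summarizes data derived from the boards it is linked to.


class Board:
    def __init__(self, statuses):
        self.statuses = statuses  # one status string per item


class DoneCountWidget:
    """A non-table presentation: counts 'Done' items across linked boards."""

    def __init__(self, boards):
        self.boards = boards  # software links to the source boards

    def render(self) -> int:
        return sum(s == "Done" for b in self.boards for s in b.statuses)


a = Board(["Done", "Stuck"])
b = Board(["Done"])
widget = DoneCountWidget([a, b])
assert widget.render() == 2

a.statuses[1] = "Done"       # a change on a linked board...
assert widget.render() == 3  # ...is reflected in the dashboard
```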


Boards (or the data associated with boards) may be stored in a local memory on a user device or may be stored in a local network repository. Boards may also be stored in a remote repository and may be accessed through a network. In some instances, permissions may be set to limit board access to the board's “owner” while in other embodiments a user's board may be accessed by other users through any of the networks described in this disclosure. When one user makes a change in a board, that change may be updated to the board stored in a memory or repository and may be pushed to the other user devices that access that same board. These changes may be made to cells, items, columns, boards, dashboard views, logical rules, or any other data associated with the boards. Similarly, when cells are tied together or are mirrored across multiple boards, a change in one board may cause a cascading change in the tied or mirrored boards or dashboards of the same or other owners.
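The cascading behavior of tied or mirrored cells described above can be sketched as a push-based update: when one cell changes, the change propagates to every cell tied to it, with a guard against propagating back and forth indefinitely. The class below is hypothetical.

```python
# Hypothetical sketch of mirrored cells: a change in one board's cell
# cascades to the cells tied to it in other boards.


class MirroredCell:
    def __init__(self, value=None):
        self.value = value
        self._mirrors = []  # cells tied to this one

    def tie(self, other: "MirroredCell") -> None:
        self._mirrors.append(other)

    def set(self, value) -> None:
        self.value = value
        for mirror in self._mirrors:
            if mirror.value != value:  # guard against infinite ping-pong
                mirror.set(value)


source = MirroredCell("In progress")
target = MirroredCell("In progress")
source.tie(target)
target.tie(source)

source.set("Done")
assert target.value == "Done"  # the tied cell received the cascade
```

In a real system the push would go through the shared repository and out to the other user devices rather than between in-memory objects, but the cascade-with-guard shape is the same.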


Boards and widgets may be part of a platform that may enable users to interact with information in real time in collaborative work systems involving electronic collaborative word processing documents. Electronic collaborative word processing documents (and other variations of the term) as used herein are not limited to only digital files for word processing, but may include any other processing document such as presentation slides, tables, databases, graphics, sound files, video files or any other digital document or file. Electronic collaborative word processing documents may include any digital file that may provide for input, editing, formatting, display, and/or output of text, graphics, widgets, objects, tables, links, animations, dynamically updated elements, or any other data object that may be used in conjunction with the digital file. Any information stored on or displayed from an electronic collaborative word processing document may be organized into blocks. A block may include any organizational unit of information in a digital file, such as a single text character, word, sentence, paragraph, page, graphic, or any combination thereof. Blocks may include static or dynamic information, and may be linked to other sources of data for dynamic updates. Blocks may be automatically organized by the system, or may be manually selected by a user according to preference. In one embodiment, a user may select a segment of any information in an electronic word processing document and assign it as a particular block for input, editing, formatting, or any other further configuration.
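The block organization described above can be sketched as a document holding a sequence of blocks, where each block records its kind and whether it is linked to an outside source for dynamic updates. The names below are hypothetical.

```python
# Hypothetical sketch of a document organized into blocks; a block may
# hold static content or be linked to a data source for dynamic updates.
from dataclasses import dataclass, field


@dataclass
class Block:
    kind: str               # "text", "graphic", "widget", ...
    content: str
    dynamic: bool = False   # linked to an outside data source?


@dataclass
class CollaborativeDocument:
    blocks: list = field(default_factory=list)

    def add_block(self, kind: str, content: str, dynamic: bool = False) -> None:
        self.blocks.append(Block(kind, content, dynamic))


doc = CollaborativeDocument()
doc.add_block("text", "Quarterly report")
doc.add_block("widget", "sales-chart", dynamic=True)
assert [b.dynamic for b in doc.blocks] == [False, True]
```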


An electronic collaborative word processing document may be stored in one or more repositories connected to a network accessible by one or more users through their computing devices. In one embodiment, one or more users may simultaneously edit an electronic collaborative word processing document. The one or more users may access the electronic collaborative word processing document through one or more user devices connected to a network. User access to an electronic collaborative word processing document may be managed through permission settings set by an author of the electronic collaborative word processing document. An electronic collaborative word processing document may include graphical user interface elements enabled to support the input, display, and management of multiple edits made by multiple users operating simultaneously within the same document.


Various embodiments are described herein with reference to a system, method, device, or computer readable medium. It is intended that the disclosure of one is a disclosure of all. For example, it is to be understood that disclosure of a computer readable medium described herein also constitutes a disclosure of methods implemented by the computer readable medium, and systems and devices for implementing those methods, via for example, at least one processor. It is to be understood that this form of disclosure is for ease of discussion only, and one or more aspects of one embodiment herein may be combined with one or more aspects of other embodiments herein, within the intended scope of this disclosure.


Embodiments described herein may refer to a non-transitory computer readable medium containing instructions that when executed by at least one processor, cause the at least one processor to perform a method. Non-transitory computer readable mediums may be any medium capable of storing data in any memory in a way that may be read by any computing device with a processor to carry out methods or any other instructions stored in the memory. The non-transitory computer readable medium may be implemented as hardware, firmware, software, or any combination thereof. Moreover, the software may preferably be implemented as an application program tangibly embodied on a program storage unit or computer readable medium consisting of parts, or of certain devices and/or a combination of devices. The application program may be uploaded to, and executed by, a machine comprising any suitable architecture. Preferably, the machine may be implemented on a computer platform having hardware such as one or more central processing units (“CPUs”), a memory, and input/output interfaces. The computer platform may also include an operating system and microinstruction code. The various processes and functions described in this disclosure may be either part of the microinstruction code or part of the application program, or any combination thereof, which may be executed by a CPU, whether or not such a computer or processor is explicitly shown. In addition, various other peripheral units may be connected to the computer platform such as an additional data storage unit and a printing unit. Furthermore, a non-transitory computer readable medium may be any computer readable medium except for a transitory propagating signal.


The memory may include a Random Access Memory (RAM), a Read-Only Memory (ROM), a hard disk, an optical disk, a magnetic medium, a flash memory, other permanent, fixed, volatile or non-volatile memory, or any other mechanism capable of storing instructions. The memory may include one or more separate storage devices collocated or dispersed, capable of storing data structures, instructions, or any other data. The memory may further include a memory portion containing instructions for the processor to execute. The memory may also be used as a working scratch pad for the processors or as a temporary storage.


Some embodiments may involve at least one processor. A processor may be any physical device or group of devices having electric circuitry that performs a logic operation on input or inputs. For example, the at least one processor may include one or more integrated circuits (ICs), including application-specific integrated circuits (ASICs), microchips, microcontrollers, microprocessors, all or part of a central processing unit (CPU), graphics processing unit (GPU), digital signal processor (DSP), field-programmable gate array (FPGA), server, virtual server, or other circuits suitable for executing instructions or performing logic operations. The instructions executed by the at least one processor may, for example, be pre-loaded into a memory integrated with or embedded into the controller or may be stored in a separate memory.


In some embodiments, the at least one processor may include more than one processor. Each processor may have a similar construction, or the processors may be of differing constructions that are electrically connected or disconnected from each other. For example, the processors may be separate circuits or integrated in a single circuit. When more than one processor is used, the processors may be configured to operate independently or collaboratively. The processors may be coupled electrically, magnetically, optically, acoustically, mechanically or by other means that permit them to interact.


Consistent with the present disclosure, disclosed embodiments may involve a network. A network may constitute any type of physical or wireless computer networking arrangement used to exchange data. For example, a network may be the Internet, a private data network, a virtual private network using a public network, a Wi-Fi network, a LAN or WAN network, and/or other suitable connections that may enable information exchange among various components of the system. In some embodiments, a network may include one or more physical links used to exchange data, such as Ethernet, coaxial cables, twisted pair cables, fiber optics, or any other suitable physical medium for exchanging data. A network may also include a public switched telephone network (“PSTN”) and/or a wireless cellular network. A network may be a secured network or unsecured network. In other embodiments, one or more components of the system may communicate directly through a dedicated communication network. Direct communications may use any suitable technologies, including, for example, BLUETOOTH™, BLUETOOTH LE™ (BLE), Wi-Fi, near field communications (NFC), or other suitable communication methods that provide a medium for exchanging data and/or information between separate entities.


Certain embodiments disclosed herein may also include a computing device for generating features for work collaborative systems. The computing device may include processing circuitry communicatively connected to a network interface and to a memory, wherein the memory contains instructions that, when executed by the processing circuitry, configure the computing device to receive, from a user device associated with a user account, an instruction to generate a new column of a single data type for a first data structure, wherein the first data structure may be a column-oriented data structure, and to store, based on the instruction, the new column within a column-oriented data structure repository, wherein the column-oriented data structure repository may be accessible to, and may be displayed as a display feature for, the user and at least a second user account. The computing devices may be devices such as mobile devices, desktops, laptops, tablets, or any other devices capable of processing data. Such computing devices may include a display such as an LED display, an augmented reality (AR) display, or a virtual reality (VR) display.


Certain embodiments disclosed herein may include a processor configured to perform methods that may include triggering an action in response to an input. The input may be from a user action or from a change of information contained in a user's table, in another table, across multiple tables, across multiple user devices, or from third-party applications. Triggering may be caused manually, such as through a user action, or may be caused automatically, such as through a logical rule, logical combination rule, or logical templates associated with a board. For example, a trigger may include an input of a data item that is recognized by at least one processor that brings about another action.


In some embodiments, the methods including triggering may cause an alteration of data and may also cause an alteration of display of data contained in a board or in memory. An alteration of data may include a recalculation of data, the addition of data, the subtraction of data, or a rearrangement of information. Further, triggering may also cause a communication to be sent to a user, other individuals, or groups of individuals. The communication may be a notification within the system or may be a notification outside of the system through a contact address such as by email, phone call, text message, video conferencing, or any other third-party communication application.


Some embodiments include one or more of automations, logical rules, logical sentence structures, and logical (sentence structure) templates. While these terms are described herein in differing contexts, in the broadest sense, in each instance an automation may include a process that responds to a trigger or condition to produce an outcome; a logical rule may underlie the automation in order to implement the automation via a set of instructions; a logical sentence structure is one way for a user to define an automation; and a logical template/logical sentence structure template may be a fill-in-the-blank tool used to construct a logical sentence structure. While all automations may have an underlying logical rule, all automations need not implement that rule through a logical sentence structure. Any other manner of defining a process that responds to a trigger or condition to produce an outcome may be used to construct an automation.
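The automation pattern described above, a logical rule pairing a trigger condition with an outcome, can be sketched as follows. The class, the event shape, and the example rule ("when a status changes to Done, notify the owner") are all hypothetical illustrations, not elements of the disclosure.

```python
# Hypothetical sketch of an automation: a logical rule that responds to
# a trigger condition with an outcome, as might be filled in from a
# "when X happens, do Y" logical sentence structure template.


class Automation:
    def __init__(self, condition, action):
        self.condition = condition  # trigger: predicate over an input event
        self.action = action        # outcome: callable run on a match

    def handle(self, event: dict):
        if self.condition(event):
            return self.action(event)
        return None


notifications = []

# "When a status changes to Done, notify the owner."
rule = Automation(
    condition=lambda e: e.get("column") == "Status" and e.get("value") == "Done",
    action=lambda e: notifications.append(f"notify {e['owner']}"),
)

rule.handle({"column": "Status", "value": "Done", "owner": "alice"})
assert notifications == ["notify alice"]
```

The trigger here is a change of information in a cell; a manual user action or a third-party event could feed the same `handle` entry point.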


Other terms used throughout this disclosure in differing exemplary contexts may generally share the following common definitions.


In some embodiments, machine learning algorithms (also referred to as machine learning models or artificial intelligence in the present disclosure) may be trained using training examples, for example in the cases described below. Some non-limiting examples of such machine learning algorithms may include classification algorithms, data regression algorithms, image segmentation algorithms, visual detection algorithms (such as object detectors, face detectors, person detectors, motion detectors, edge detectors, etc.), visual recognition algorithms (such as face recognition, person recognition, object recognition, etc.), speech recognition algorithms, mathematical embedding algorithms, natural language processing algorithms, support vector machines, random forests, nearest neighbors algorithms, deep learning algorithms, artificial neural network algorithms, convolutional neural network algorithms, recursive neural network algorithms, linear machine learning models, non-linear machine learning models, ensemble algorithms, and so forth. For example, a trained machine learning algorithm may comprise an inference model, such as a predictive model, a classification model, a regression model, a clustering model, a segmentation model, an artificial neural network (such as a deep neural network, a convolutional neural network, a recursive neural network, etc.), a random forest, a support vector machine, and so forth. In some examples, the training examples may include example inputs together with the desired outputs corresponding to the example inputs. Further, in some examples, training machine learning algorithms using the training examples may generate a trained machine learning algorithm, and the trained machine learning algorithm may be used to estimate outputs for inputs not included in the training examples. In some examples, engineers, scientists, processes and machines that train machine learning algorithms may further use validation examples and/or test examples.
For example, validation examples and/or test examples may include example inputs together with the desired outputs corresponding to the example inputs, a trained machine learning algorithm and/or an intermediately trained machine learning algorithm may be used to estimate outputs for the example inputs of the validation examples and/or test examples, the estimated outputs may be compared to the corresponding desired outputs, and the trained machine learning algorithm and/or the intermediately trained machine learning algorithm may be evaluated based on a result of the comparison. In some examples, a machine learning algorithm may have parameters and hyper-parameters, where the hyper-parameters are set manually by a person or automatically by a process external to the machine learning algorithm (such as a hyper-parameter search algorithm), and the parameters of the machine learning algorithm are set by the machine learning algorithm according to the training examples. In some implementations, the hyper-parameters are set according to the training examples and the validation examples, and the parameters are set according to the training examples and the selected hyper-parameters.
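The parameter/hyper-parameter split described above can be sketched with a deliberately tiny learner: the model's parameter (a threshold) is fit from the training examples, while the hyper-parameter (a candidate weight) is chosen by evaluating each candidate on held-out validation examples. The learner and the data are invented for illustration only.

```python
# Hypothetical sketch of training with validation examples: parameters
# are set from the training data; hyper-parameters are selected by
# comparing estimated outputs to desired outputs on validation data.

def train(examples, weight):
    # "Parameter": a threshold learned as a weighted mean of the inputs.
    xs = [x for x, _ in examples]
    return weight * sum(xs) / len(xs)

def accuracy(threshold, examples):
    # Compare estimated outputs (x > threshold) to desired outputs y.
    return sum((x > threshold) == y for x, y in examples) / len(examples)

train_set = [(0.1, False), (0.2, False), (0.8, True), (0.9, True)]
val_set = [(0.3, False), (0.7, True)]

# "Hyper-parameter": candidate weights set outside the learner itself,
# evaluated on the validation examples.
best_weight = max([0.5, 1.0, 1.5],
                  key=lambda w: accuracy(train(train_set, w), val_set))
model = train(train_set, best_weight)
assert accuracy(model, val_set) == 1.0
```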



FIG. 1 is a block diagram of an exemplary computing device 100 for generating a column and/or row oriented data structure repository for data consistent with some embodiments. The computing device 100 may include processing circuitry 110, such as, for example, a central processing unit (CPU). In some embodiments, the processing circuitry 110 may include, or may be a component of, a larger processing unit implemented with one or more processors. The one or more processors may be implemented with any combination of general-purpose microprocessors, microcontrollers, digital signal processors (DSPs), field programmable gate arrays (FPGAs), programmable logic devices (PLDs), controllers, state machines, gated logic, discrete hardware components, dedicated hardware finite state machines, or any other suitable entities that can perform calculations or other manipulations of information. The processing circuitry, such as processing circuitry 110, may be coupled via a bus 105 to a memory 120.


The memory 120 may further include a memory portion 122 that may contain instructions that when executed by the processing circuitry 110, may perform the method described in more detail herein. The memory 120 may be further used as a working scratch pad for the processing circuitry 110, a temporary storage, and others, as the case may be. The memory 120 may be a volatile memory such as, but not limited to, random access memory (RAM), or non-volatile memory (NVM), such as, but not limited to, flash memory. The processing circuitry 110 may be further connected to a network device 140, such as a network interface card, for providing connectivity between the computing device 100 and a network, such as a network 210, discussed in more detail with respect to FIG. 2 below. The processing circuitry 110 may be further coupled with a storage device 130. The storage device 130 may be used for the purpose of storing single data type column-oriented data structures, data elements associated with the data structures, or any other data structures. While illustrated in FIG. 1 as a single device, it is to be understood that storage device 130 may include multiple devices either collocated or distributed.


The processing circuitry 110 and/or the memory 120 may also include machine-readable media for storing software. “Software” as used herein refers broadly to any type of instructions, whether referred to as software, firmware, middleware, microcode, hardware description language, or otherwise. Instructions may include code (e.g., in source code format, binary code format, executable code format, or any other suitable format of code). The instructions, when executed by the one or more processors, may cause the processing system to perform the various functions described in further detail herein.



FIG. 2 is a block diagram of computing architecture 200 that may be used in connection with various disclosed embodiments. The computing device 100, as described in connection with FIG. 1, may be coupled to network 210. The network 210 may enable communication between different elements that may be communicatively coupled with the computing device 100, as further described below. The network 210 may include the Internet, the world-wide-web (WWW), a local area network (LAN), a wide area network (WAN), a metro area network (MAN), and other networks capable of enabling communication between the elements of the computing architecture 200. In some disclosed embodiments, the computing device 100 may be a server deployed in a cloud computing environment.


One or more user devices 220-1 through user device 220-m, where ‘m’ is an integer equal to or greater than 1, referred to individually as user device 220 and collectively as user devices 220, may be communicatively coupled with the computing device 100 via the network 210. A user device 220 may be, for example, a smart phone, a mobile phone, a laptop, a tablet computer, a wearable computing device, a personal computer (PC), a smart television, and the like. A user device 220 may be configured to send to and receive from the computing device 100 data and/or metadata associated with a variety of elements associated with single data type column-oriented data structures, such as columns, rows, cells, schemas, and the like.


One or more data repositories 230-1 through data repository 230-n, where ‘n’ is an integer equal to or greater than 1, referred to individually as data repository 230 and collectively as data repositories 230, may be communicatively coupled with the computing device 100 via the network 210, or embedded within the computing device 100. Each data repository 230 may be communicatively connected to the network 210 through one or more database management services (DBMS) 235-1 through DBMS 235-n. The data repository 230 may be, for example, a storage device containing a database, a data warehouse, and the like, that may be used for storing data structures, data items, metadata, or any information, as further described below. In some embodiments, one or more of the repositories may be distributed over several physical storage devices, e.g., in a cloud-based computing environment. Any storage device may be a network accessible storage device, or a component of the computing device 100.



FIG. 3 is an exemplary embodiment of a presentation of an electronic collaborative word processing document 301 via an editing interface or editor 300. The editor 300 may include any user interface components 302 through 312 to assist with input or modification of information in an electronic collaborative word processing document 301. For example, editor 300 may include an indication of an entity 312, which may include at least one individual or group of individuals associated with an account for accessing the electronic collaborative word processing document. User interface components may provide the ability to format a title 302 of the electronic collaborative word processing document, select a view 304, perform a lookup for additional features 306, view an indication of other entities 308 accessing the electronic collaborative word processing document at a certain time (e.g., at the same time or at a recorded previous time), and configure permission access 310 to the electronic collaborative word processing document. The electronic collaborative word processing document 301 may include information that may be organized into blocks as previously discussed. For example, a block 320 may itself include one or more blocks of information. Each block may have similar or different configurations or formats according to a default or according to user preferences. For example, block 322 may be a “Title Block” configured to include text identifying a title of the document, and may also contain, embed, or otherwise link to metadata associated with the title. A block may be pre-configured to display information in a particular format (e.g., in bold font). Other blocks in the same electronic collaborative word processing document 301, such as compound block 320 or input block 324 may be configured differently from title block 322. 
As a user inputs information into a block, either via input block 324 or a previously entered block, the platform may provide an indication of the entity 318 responsible for inputting or altering the information. The entity responsible for inputting or altering the information in the electronic collaborative word processing document may include any entity accessing the document, such as an author of the document or any other collaborator who has permission to access the document.


This disclosure presents various mechanisms for dynamic work systems. Such systems may involve software that enables electronic word processing documents to include dynamic activity. By way of one example, software may enable various dynamic elements from a live application to be reflected in an electronic word processing document. It is intended that one or more aspects of any mechanism may be combined with one or more aspects of any other mechanisms, and such combinations are within the scope of this disclosure.


This disclosure is constructed to provide a basic understanding of a few exemplary embodiments with the understanding that features of the exemplary embodiments may be combined with other disclosed features or may be incorporated into platforms or embodiments not described herein while still remaining within the scope of this disclosure. For convenience, any form of the word “embodiment” as used herein is intended to refer to a single embodiment or multiple embodiments of the disclosure.


In electronic word processing documents, it may be beneficial to employ a myriad of actions for triggering edits to the document when one or more conditions are met. Ensuring that the information in an electronic word processing document is up-to-date when that information is related to dynamically changing applications external to the electronic word processing document can be daunting when the possible changes to the applications could be endless. Therefore, there may be a need for unconventional innovations for ensuring that data in an electronic word processing document is up-to-date and correct through efficient processing and storing methods.


Some disclosed embodiments may involve systems, methods, and computer-readable media for causing dynamic activity in an electronic word processing document. The systems and methods described herein may be implemented with the aid of at least one processor or non-transitory computer readable medium, such as a CPU, FPGA, ASIC, or any other processing structure(s) or storage medium, as described herein. Dynamic activity, as used herein, may include updating, syncing, changing, manipulating, or any other form of altering information associated with an electronic word processing document in response to an alteration of another source of data or any other trigger or threshold being met. Causing dynamic activity may include carrying out instructions to continuously or periodically update information in an electronic word processing document so that the dynamic activity may be updated in real time or in near-real time. For example, causing dynamic activity may include altering text, images, font size, or any other data present in the electronic word processing document in response to continuous or periodic lookups and detecting a threshold for carrying out an activity, as carried out by steps discussed in further detail below. Electronic word processing documents (and other variations of the term) as used herein are not limited to only digital files for word processing, but may include any other processing document such as presentation slides, tables, databases, graphics, sound files, video files or any other digital document or file. Electronic word processing documents may include any digital file that may provide for input, editing, formatting, display, and/or output of text, graphics, widgets, objects, tables, links, animations, dynamically updated elements, or any other data object that may be used in conjunction with the digital file. Any information stored on or displayed from an electronic word processing document may be organized into blocks.
A block may include any organizational unit of information in a digital file, such as a single text character, word, sentence, paragraph, page, graphic, or any combination thereof. Blocks may include static or dynamic information, and may be linked to other sources of data for dynamic updates. Blocks may be automatically organized by the system, or may be manually selected by a user according to preference. In one embodiment, a user may select a segment of any information in an electronic word processing document and assign it as a particular block for input, editing, formatting, or any other further configuration. An electronic word processing document may be stored in one or more repositories connected to a network accessible by one or more users through their computing devices.
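One hypothetical way to model the blocks described above; the class name, fields, and sample values are illustrative assumptions and do not appear in the disclosure:

```python
# Illustrative sketch of a block as an organizational unit of information.
# Field names ("kind", "fmt", "source_url", etc.) are hypothetical.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Block:
    """One organizational unit: text, a graphic, or a compound of child blocks."""
    kind: str                                   # e.g. "title", "text", "compound"
    content: str = ""
    fmt: dict = field(default_factory=dict)     # pre-configured format, e.g. bold
    children: list = field(default_factory=list)
    source_url: Optional[str] = None            # link to a data source for dynamic updates

# A "Title Block" pre-configured to display in bold, nested in a compound block.
title = Block(kind="title", content="Travel Itinerary", fmt={"bold": True})
document = Block(kind="compound",
                 children=[title, Block(kind="text", content="Day 1 ...")])
```

Each block carries its own configuration, so a title block can render in bold while sibling blocks in the same document are configured differently.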


Some disclosed embodiments may include accessing an electronic word processing document. Accessing an electronic word processing document may include retrieving the electronic word processing document from a storage medium, such as a local storage medium or a remote storage medium. A local storage medium may be maintained, for example, on a local computing device, on a local network, or on a resource such as a server within or connected to a local network. A remote storage medium may be maintained in the cloud, or at any other location other than a local network. In some embodiments, accessing the electronic word processing document may include retrieving the electronic word processing document from a web browser cache. Additionally or alternatively, accessing the electronic word processing document may include accessing a live data stream of the electronic word processing document from a remote source. In some embodiments, accessing the electronic word processing document may include logging into an account having a permission to access the document. For example, accessing the electronic word processing document may be achieved by interacting with an indication associated with the electronic word processing document, such as an icon or file name, which may cause the system to retrieve (e.g., from a storage medium) a particular electronic word processing document associated with the indication.
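The fallback chain described above (browser cache, then a local storage medium, then a remote source) might be sketched as follows; the function and store names are hypothetical:

```python
# Illustrative sketch: resolve a document by trying a cache, then local
# storage, then a remote fetch. All names are hypothetical.

def access_document(doc_id, cache, local_store, fetch_remote):
    """Return the document: cache first, then local storage, then remote."""
    if doc_id in cache:
        return cache[doc_id]
    if doc_id in local_store:
        return local_store[doc_id]
    doc = fetch_remote(doc_id)   # e.g. a live data stream from a remote source
    cache[doc_id] = doc          # keep a copy for later accesses
    return doc

cache = {}
local = {"itinerary": "local copy"}
doc = access_document("itinerary", cache, local, lambda _id: "remote copy")
```

In practice the remote path might also involve verifying that the requesting account has permission to access the document before returning it.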


For example, as shown in FIG. 2, a user device 220-1 can send a request to access the electronic word processing document to the network 210. The request can then be communicated to the repository 230-1 where the document is stored via the database management system 235-1. The electronic word processing document can be retrieved from the repository 230-1 and transferred through the database management service 235-1 and network 210 for display on the user device 220-1.


By way of example, FIG. 4 illustrates an electronic word processing document 410, consistent with some embodiments of the present disclosure. As shown in the figure, an electronic word processing document 410 can include information regarding an itinerary created by a user of the electronic word processing document 410. For ease of discussion, the electronic word processing document 410 presented in the figure may be representative of displaying a user's itinerary on a calendar, but, as explained above, it is to be understood that the electronic word processing document can be any digital file.


Some disclosed embodiments may include presenting an interface enabling selection of a live application, outside an electronic word processing document, for embedding in the electronic word processing document. An application consistent with the present disclosure may include any set of instructions or commands for carrying out any number of actions or tasks in relation to a source of data or data object. A live application may be an application that continuously or periodically carries out its instructions. For example, a live application may include a packaged set of instructions for retrieving and displaying data or information such as the price of a stock, the weather for a certain location, flight information, or any other information that may be dynamic. As another example, a live application may include a packaged set of instructions for retrieving static or dynamic data from another electronic word processing document for display or manipulation, such as a graphical representation of a pie chart, status of a project, or any other form of data or metadata present in the other electronic word processing document. A live application outside the electronic word processing document may include a live application hosted by a third party platform independent from the electronic word processing document. For example, a live application outside of the electronic word processing document may include a flight tracking application, a weather application, or any other set of instructions or commands for continuously or periodically carrying out any number of actions or tasks in relation to a source of data or data object hosted by a third party platform independent of the platform hosting the electronic document (e.g., an electronic word processing application). Presenting an interface may include rendering a display of information with activatable elements that may enable interaction with the information through a computing device. 
An interface enabling selection of a live application may include any rendered display of information that may include options corresponding to different live applications with the same or different functionality such that any of the live applications may be selected through an interaction from a computing device associated with a user (e.g., through an activatable element such as a graphical button). For example, the interface may include a graphical user interface rendering a menu option of one or more live applications that may be depicted by indicators (e.g., graphical, alphanumeric, or a combination thereof) that may be configured to select the corresponding application in response to an interaction with a particular indicator, such as with a mouse click or a cursor hover. In response to a selection of a live application, a user may be enabled to upload electronic word processing documents, elect the data to be embedded, enter a website address along with the relevant data to be embedded, or carry out any other tasks via the interface. As another example, the interface may include a graphical user interface allowing the user to manually identify, via textual or any other sensory form (visual, auditory, or tactile) of input, a data source and/or data set for embedding. Embedding in an electronic word processing document may, in some embodiments, include inserting data or a link within an electronic word processing document. Such embedding may be visible at the user interface level or may occur at the code level. In some embodiments, embedding may involve generating a data structure, storing information in the data structure, and rendering a display of information in the data structure within an electronic word processing document at a particular location of the electronic word processing document or in association with the electronic word processing document, as discussed previously. 
A data structure consistent with the present disclosure may include any collection of data values and relationships among them. The data may be stored linearly, horizontally, hierarchically, relationally, non-relationally, uni-dimensionally, multidimensionally, operationally, in an ordered manner, in an unordered manner, in an object-oriented manner, in a centralized manner, in a decentralized manner, in a distributed manner, in a custom manner, or in any manner enabling data access. By way of non-limiting examples, data structures may include an array, an associative array, a linked list, a binary tree, a balanced tree, a heap, a stack, a queue, a set, a hash table, a record, a tagged union, an ER model, and a graph. For example, a data structure may include an XML database, an RDBMS database, an SQL database, or NoSQL alternatives for data storage/search such as, for example, MongoDB, Redis, Couchbase, Datastax Enterprise Graph, Elastic Search, Splunk, Solr, Cassandra, Amazon DynamoDB, Scylla, HBase, and Neo4J. A data structure may be a component of the disclosed system or a remote computing component (e.g., a cloud-based data structure). Data in the data structure may be stored in contiguous or non-contiguous memory. Moreover, a data structure, as used herein, does not require information to be co-located. It may be distributed across multiple servers, for example, that may be owned or operated by the same or different entities. Thus, the term “data structure” as used herein in the singular is inclusive of plural data structures.


A repository may store data such as an array, linked list, object, data field, chart, graph, graphical user interface, video, animation, iframe, HTML element (or element in any other markup language), and/or any other representation of data conveying information from an application. In some embodiments, embedding in the electronic word processing application may include inserting lines of code (e.g., HTML data) into a file or other software instance representing the electronic word processing document. For example, HTML text may represent the electronic word processing document, and embedding the live application within the electronic word processing application may include inserting lines of code into the HTML text to cause the electronic word processing document to source data (e.g., for rendering within the embedded electronic non-word processing application), which may be content data for an associated data structure. In some embodiments, embedding the live application within the electronic word processing application may include inserting code associated with an API or software development toolkit (SDK) into the electronic word processing application and/or electronic word processing document.
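As one illustration of code-level embedding, the following minimal sketch inserts an iframe into HTML text representing the document; the anchor string, widget URL, and class name are hypothetical assumptions, not the disclosed implementation:

```python
# Illustrative sketch: embed a live application by inserting lines of code
# (here, an iframe) into the HTML text representing the document.
# The anchor string and URL are hypothetical.

def embed_live_application(html_text, anchor, app_url):
    """Insert an iframe sourcing the live application just after `anchor`."""
    snippet = f'<iframe src="{app_url}" class="live-app"></iframe>'
    return html_text.replace(anchor, anchor + snippet, 1)

doc_html = "<p>Flight details: FLIGHT_STATUS</p>"
embedded = embed_live_application(doc_html, "FLIGHT_STATUS",
                                  "https://example.com/flight-widget")
```

When the document is rendered, the inserted element causes the document to source its content from the live application rather than from static text.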


For example, embedding an application in the electronic word processing document may occur when a user selects a position, portion, or region of the document (e.g., the first line of the document) and selects an application to be stored and operated from that position, portion, or region of the document. It should be understood that the user can define how the application is embedded relative to the document layout, the data present in the document, or relative to any other features of the document. For example, a user may embed the application to operate from a static position, such as the bottom right corner of a page of the document, or dynamically, such as in-line with the text of a paragraph so that when a position of the paragraph moves, so too does the embedded application. The system may render an options menu for presenting one or more applications for embedding into the electronic word processing document. The system may perform a lookup of available applications to embed (e.g., through a marketplace or through a local repository) and enable a user to select one or more applications for embedding into the electronic word processing document.



FIG. 5 illustrates an exemplary interface 510 enabling selection of a live application via indicator 512, outside the electronic word processing document, for embedding in the electronic word processing document. While not shown, a user may be presented with an interface displaying different applications that may be selected for embedding in an electronic word processing document. In FIG. 5, a user may be enabled to interact with indicator 512 to confirm a selection of the live application or to change or add a selection of another live application to embed in the electronic word processing document. The live application options may be third party applications hosted by platforms independent of the electronic word processing document. The live applications may be selected for embedding in the document. In FIG. 5, interface 510 may enable selection of a live application 512 by interacting with lookup interface 514 that may enable a user to manually enter text to identify a set of data or information located in a repository, or to upload a new set of information not already stored in a repository. However, it is understood that the selection of a live application is not limited to these embodiments and can be implemented in any manner as discussed herein or in any manner that allows the user to select an application to act on any selected data for embedding in the word processing document.


Some disclosed embodiments may include embedding, in-line with text of an electronic word processing document, a live active icon representative of a live application. A live active icon as used herein may include a symbol, emblem, sign, mark, or any other character or graphical representation that may be displayed dynamically (e.g., displayed via animations or displayed according to updates of information). The selection of a live active icon may be automated using automation or logical rules based on the live application or may be selected manually by a user. Embedding a live active icon representative of a live application may include selecting a portion of an electronic word processing document for storing and rendering a graphical representation that may be rendered dynamically and correspond to information associated with a live application, consistent with the methods discussed previously above regarding embedding applications. A live active icon may be said to be representative of a live application in that the live active icon may include a rendering of information related to information in the live application in a reduced or substituted format, discussed in further detail below. Embedding a live active icon in-line with text of the electronic word processing document may include displaying a rendering of a live active icon in a portion of the document that is insertable between alphanumeric characters where characteristics of the live active icon are structured to be compatible with characteristics of the alphanumeric characters retrieved from a data structure in a repository. The data and information stored in the data structure may include the font size, format, color, or any other characteristics of the selected alphanumeric characters. In some embodiments, embedding in-line with text may include sizing a live active icon to correspond to an in-line text font size.
Sizing the live active icon to correspond to an in-line text font size may include retrieving and identifying, from a data structure, the font size of the alphanumeric characters surrounding the live active icon placement and manipulating the rendered display of the live active icon to be equivalent or similar to the size of the alphanumeric characters surrounding the live active icon placement location. Manipulating the rendered display of the live active icon may include altering the size, orientation, imagery, or any other characteristic of the live active icon such that the resulting size of the icon is equivalent or similar to the size of the alphanumeric characters surrounding the live active icon's placement. The sizing of the live active icon may be manually specified by the user, automated based on logical rules, or based on any other manner of defining a process that responds to a condition to produce an outcome. For example, a logical rule could be established to size a display of the live active icon to the maximum in-line text font size that is present in the document as a whole or the maximum in-line text font size that is present in the line of text in which the live active icon resides. As a further example, the system may be configured to resolve conflicting sizing requirements in a single embedding. For example, if the font sizes surrounding the placement of the live active icon retrieved from the data structure are not equivalent, the system may size the display of the live active icon to be equivalent to the preceding font size, equivalent to the subsequent font size, or an average of both font sizes, or may size the live active icon based on any other automation, logical rules, or any other defining process that responds to a condition to produce an outcome set by the user or determined by the system.
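The conflicting-size resolution described above can be sketched as follows; the policy names are illustrative assumptions:

```python
# Illustrative sketch: resolve the live active icon's size from the font
# sizes of the surrounding in-line characters. Policy names are hypothetical.

def icon_font_size(preceding, subsequent, policy="average"):
    """Pick an icon size given the preceding and subsequent font sizes."""
    if preceding == subsequent:
        return preceding                 # no conflict to resolve
    if policy == "preceding":
        return preceding
    if policy == "subsequent":
        return subsequent
    if policy == "average":
        return (preceding + subsequent) / 2
    raise ValueError(f"unknown policy: {policy}")

size = icon_font_size(12, 16, policy="average")
```

A user-defined logical rule could swap the default policy, for example always matching the larger of the two surrounding sizes.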


Some embodiments may include one or more of automations, logical rules, logical sentence structures, and logical (sentence structure) templates. While these terms are described herein in differing contexts, in a broadest sense, in each instance an automation may include a process that responds to a trigger or condition to produce an outcome; a logical rule may underlie the automation in order to implement the automation via a set of instructions; a logical sentence structure is one way for a user to define an automation; and a logical template/logical sentence structure template may be a fill-in-the-blank tool used to construct a logical sentence structure. While all automations may have an underlying logical rule, not all automations need implement that rule through a logical sentence structure. Any other manner of defining a process that responds to a trigger or condition to produce an outcome may be used to construct an automation.
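A minimal sketch of the relationship between a fill-in-the-blank template, the logical sentence structure built from it, and the underlying logical rule; the template wording and all names are hypothetical:

```python
# Illustrative sketch: a logical sentence structure template, a sentence
# filled in from it, and the underlying logical rule (the automation).
TEMPLATE = "When {trigger}, then {action}"   # fill-in-the-blank template

def build_automation(trigger_check, action):
    """Underlying logical rule: run `action` whenever `trigger_check` is met."""
    def automation(event):
        if trigger_check(event):
            return action(event)
        return None                          # condition not met; no outcome
    return automation

# Template filled in to form a logical sentence structure.
sentence = TEMPLATE.format(trigger="flight status changes",
                           action="update the icon")

# The rule implementing that sentence.
automation = build_automation(
    trigger_check=lambda e: e.get("field") == "flight_status",
    action=lambda e: f"icon updated to {e['value']}")

result = automation({"field": "flight_status", "value": "Delayed"})
```

The sentence is only the user-facing definition; the same rule could equally be constructed without any sentence structure at all.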


As illustrated in FIG. 6, the electronic word processing document 610 may contain live active icons 612, 614, 616, 618, and 620, represented by alphanumeric text or graphical representations that are representative of a respective live application, embedded in-line with the text of the electronic word processing document. For ease of discussion, the live active icons present in the figure are representative of live applications providing information related to a flight status and gate information 612 and 620 and the weather 614, 616, and 618 for the corresponding days on the calendar, but it is to be understood that the live active icons can be representative of any data that is selected to be included in the live applications. As illustrated, the weather-based live active icon 614 may correspond to a weather-based, live application and may be depicted with a graphical representation of a sun to represent corresponding information in the live application: the forecasted weather of a sunny day in Los Angeles, CA on Mar. 3, 2022. Similarly, as illustrated, the weather-based live active icon 616 may be depicted as a graphical representation of a cloud with rain to represent the corresponding information in the weather-based live application that the weather is forecasted to be a rainy day in Vail, CO on Mar. 7, 2022. As illustrated, the weather-based live active icon 618 may be depicted with a graphical representation of a cloud with rain and a lightning bolt to represent the corresponding information in the weather-based live application, that is, the forecasted weather of thunderstorms in Vail, CO on Mar. 8, 2022. These weather-based active icons may be live in that as the underlying information changes, so too does the graphical representation.
For example, weather-based icon 616 may be rendered with a cloud and rain drops because the live application retrieves forecast information outside the electronic word processing document that rain is expected on March 7 in Vail, CO. However, once the forecast information is updated and the application changes its forecast to sunny on March 7 in Vail, CO, the weather-based icon 616 may be re-rendered with a graphical indication of a sun to reflect the underlying forecast information that has been changed. The underlying data from the live application represented by the live active icons may be determined manually by the user, via a mouse by clicking or hovering on certain data or by any other sensory form (visual, auditory, or tactile) of input, or the data may be determined by the system using logical rules, automation, machine learning, or artificial intelligence. For example, as disclosed above, a user could use the interface 510 to identify the live application. While not shown in FIG. 5, a user may also access the live application and elect certain data from the live application to be represented by the corresponding live active icon. For example, as seen in FIG. 6, the data represented by the live active icon 614 may be selected by a user accessing the weather tracker live application and selecting the particular data of the expected weather in Los Angeles, CA on Mar. 3, 2022 to be represented by the live active icon. As a further example, once the live application is elected, the system may perform contextual detection on the position of the live active icon in the electronic word processing document to determine the relevant data from the live application to be represented in the live active icon. For example, in FIG. 6, once a user selects the live application to be a weather tracker application and selects the position of the live active icon 614 to be placed in the entry for March 3rd, the system may perform contextual detection to analyze the surrounding data in the March 3rd location to determine that the live active icon (and the associated live application) is being applied to represent the particular data from the weather in Los Angeles, CA at 7:00 PM. Once the data from the live application represented by the live active icons is selected, the data may be recorded and stored in a data structure, stored in the metadata of the live active icon, or stored by any other method that allows for the data from the live application to be recorded.


By way of example, FIG. 5 depicts an interface 510 that may allow a user to choose an icon 516 to represent the live application 512 that may be selected for embedding. As represented by the icon selection area 518 in FIG. 5, the interface 510 may allow for an icon 516 to be chosen from a dropdown menu or manually uploaded by the user. FIG. 5 shows an exemplary depiction for these options for selecting an icon, but it is to be understood that the live active icons can be selected in any way that allows for a character to be representative of the live application selected to be embedded.


Some disclosed embodiments may include presenting, in a first viewing mode, the live active icon wherein during the first viewing mode, the live active icon is displayed embedded in-line with the text, and the live active icon dynamically changes based on occurrences outside the electronic word processing document. Presenting, in a first viewing mode, the live active icon, as used herein, may include rendering a display of the live active icon in a first format, such as in the format of an indicator (e.g., graphical, alphanumeric, or a combination thereof) that is representative of the selected data in the live active icon's corresponding live application. Displaying the live active icon embedded in-line with the text may include rendering a presentation of the live active icon in between alphanumeric characters. Dynamically changing, as used herein, may include re-rendering or replacing the icon, changing the icon's color, shape, size, background, orientation, the icon itself, or any other edit or modification on a continuous or periodic basis based on retrieved updates, such as an occurrence outside an electronic word processing document. The live active icons may dynamically change as manually specified by the user, automatically based on logical rules, or based on any other manner of defining a process that responds to a condition to produce an outcome. An occurrence outside the electronic word processing document may include any event that meets a defined threshold according to a condition. For example, an occurrence outside the electronic word processing document may include a flight status changing from "On-time" to "Delayed" because this may meet a defined threshold of any status change. As a result of this flight status change, a system may retrieve this update in a live application across a network, which may cause the display of an associated live active icon to change to reflect the flight status change. 
Automated dynamic changing may include evaluating if an occurrence has occurred in a live application outside of the electronic word processing document and upon that evaluation, retrieving a display alteration (e.g., a first viewing mode) to apply to the icon from a data structure. A data structure consistent with the present disclosure may include any collection of data values and relationships among them. The data structure may be maintained on one or more of a server, in local memory, or any other repository suitable for storing any of the data that may be associated with a plurality of rules and any other data objects. For example, a live active icon may dynamically change based on the system's evaluation of an occurrence internal or external to the live application, which may then be used to look up a corresponding icon manipulation in a data structure. Evaluating occurrences outside of the electronic word processing document may include using an application programming interface, scraping text from a data source and comparing that data to the data stored in a data structure for the corresponding live active icon and calculating if a change in value has occurred, or any other method of interacting with data outside of the electronic word processing document to analyze the data present at that time. Evaluating occurrences outside of the electronic word processing document may also include establishing triggers for evaluating the data source, such as user defined events, a user defined frequency of evaluation, or any other manner of defining a trigger including user definitions, automation, logical rules, or machine learning.
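As a minimal illustrative sketch (not part of the disclosed embodiments), the evaluation-and-lookup pattern described above might be expressed as follows, assuming a simple dictionary serves as the data structure and the rule names and forecast values are hypothetical:

```python
# Illustrative sketch only; ICON_RULES is a hypothetical stand-in for a
# data structure mapping external data values to icon manipulations.
ICON_RULES = {
    "rain": "cloud-with-raindrops",
    "sunny": "sun",
    "storm": "cloud-with-lightning",
}

def evaluate_occurrence(stored_value, current_value):
    """An occurrence meets the defined threshold of 'any change'."""
    return stored_value != current_value

def update_live_active_icon(icon_state, current_value):
    """Re-render the live active icon only when an occurrence is detected."""
    if evaluate_occurrence(icon_state["value"], current_value):
        icon_state["value"] = current_value
        icon_state["glyph"] = ICON_RULES.get(current_value, "question-mark")
    return icon_state

icon = {"value": "rain", "glyph": ICON_RULES["rain"]}
icon = update_live_active_icon(icon, "sunny")  # forecast changed to sunny
```

Here the stored value plays the role of the data recorded for the corresponding live active icon, and the dictionary lookup plays the role of retrieving the corresponding icon manipulation from a data structure.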



FIG. 6, FIG. 7A, FIG. 7B, FIG. 8A, FIG. 8B, and FIG. 9A depict exemplary live active icons in a first viewing mode in-line with the text in the electronic word processing document. As illustrated in FIG. 7A and FIG. 7B, a live active icon depicting the live application's forecasted weather in Vail, CO on Mar. 7, 2022 can dynamically change from a live active icon 712A depicting the live application's rainy forecast to a live active icon 712B depicting the live application's updated sunny forecast in response to the change in forecast of the live application. FIG. 7A and FIG. 7B depict a live active icon dynamically changing due to the occurrence of an updated weather forecast, but it should be understood that a live active icon as described herein can dynamically change based on any evaluation of data within or outside the electronic word processing document.


Displays of live active icons may also be chosen as a group, family, or any other organization of live active icons. For example, in selecting the live application of weather in Vail, CO, as shown in FIG. 6, a user may select a family of weather-based live active icons to represent the live application and its underlying data. In this example, the live active icon could dynamically change to any other live active icon within the family including clouds with rain, clouds with lightning, the sun, or any other icon depicting a weather phenomenon.


In some embodiments, an interface may be configured to enable selection of abridged information for presentation in a first viewing mode. Abridged information for presentation, as used herein, may include any reduction of information (e.g., less than all of the information) that may be displayed in a display format for viewing. Enabling selection of abridged information for presentation may include presenting all or some of the information contained in a live application, receiving an input to instruct the processor to select an amount of information less than the original presentation of all or some of the information, and displaying the selected amount of information as the abridged information. For example, a live application may act on underlying data regarding flight status with a particular airline retrieved from the particular airline's website. The system may be enabled to receive a selection of information in the live application to select only the flight status itself (e.g., on-time, delayed, canceled) and not the rest of the information in the live application such as the flight number, departure date, and any other information. As a result, the system may present the flight status in a graphical manner as the live active icon that may be embedded in an electronic document. Abridged information may also include data retrieved from the running of an automation, logical rules, machine learning, or artificial intelligence. For example, the abridged information to be presented in the first viewing mode could be based on contextual detection. The system may analyze the text surrounding the position of the live active icon, the data present in the live application, or any other data available to the system to determine which information from the live application to include in the first viewing mode for the live active icon. 
As another example, the system may use contextual detection to determine the type of information present in the live application (e.g., a flight tracking application or a weather tracking application) to look up that type of data in a data structure to find the corresponding abridged information to select to include in the first viewing mode. Similar to the example above regarding using the flight status as the abridged information, instead of the system receiving a selection of the information to determine which information to use as the abridged information, the system may automatically detect that the flight status and gate information should be used as the abridged information based on semantic analysis of the particular airline's website providing the underlying information and data. Additionally, the determination of the abridged information to include in the first viewing mode could be performed using automation, logical rules, machine learning, or any other manner of analyzing a data source to determine the data that is relevant to include in the first viewing mode.
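The contextual-detection step described above can be sketched in simplified form; the classification keywords, field names, and dictionary-based data structure below are assumptions for illustration, not a disclosed implementation:

```python
# Illustrative sketch: classify the live application's data, then look up
# which fields to abridge into the first viewing mode. Field names such
# as "flight_number" and "gate" are hypothetical.
ABRIDGED_FIELDS = {
    "flight": ["status", "gate"],
    "weather": ["condition", "temperature"],
}

def classify_application(data):
    """Determine the type of information present in the live application."""
    keys = set(data)
    if {"flight_number", "status"} <= keys:
        return "flight"
    if {"condition", "temperature"} <= keys:
        return "weather"
    return None

def abridge(data):
    """Select less than all of the live application's information."""
    kind = classify_application(data)
    fields = ABRIDGED_FIELDS.get(kind, list(data))
    return {k: data[k] for k in fields if k in data}

full = {"flight_number": "UA 123", "status": "DELAYED",
        "gate": "B7", "departure_date": "2022-03-03"}
abridged = abridge(full)  # only the status and gate survive abridgment
```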


By way of example, FIG. 6 depicts live active icons 612 and 620 in the first viewing mode containing abridged information, including the flight status and departure gate, of the flights that may be dynamically changed on the corresponding days of the itinerary in the electronic word processing document 610. The abridged information present in the display of the live active icons 612 and 620 may be selected manually by the user or automatically by the system using contextual detection, automation, logical rules, machine learning, or any other manner of analyzing a data source (e.g., the airline's website) to determine the relevant data to include in the first viewing mode.


In some embodiments, a live active icon may include an animation that plays in-line with the text during the first viewing mode. An animation that plays, as used herein, may include any visual display of the live active icon in a manner that visually changes or otherwise re-renders to display different information. For example, an icon may visually change in color to show a change in temperature over time. In another example, the icon may be visually depicted to represent movement, such as a graphical representation of an airplane with a moving propeller (e.g., via a GIF, video clip, or any other sequence of visual representations). In another example, the live active icon may rotate between different modes of display such that the live active icon displays different amounts of information in each mode. For example, a live active icon may alternate between a graphical display of an airplane, which may then switch to a display of alphanumerics including flight status or other flight information. The method of manipulating the live active icon to show changes or edits may include a sequence and may include implementing a user-defined manipulation or a manipulation based on logical rules or any other manner of defining an output for a corresponding input. For example, a user may elect to animate the selected live active icon to be embedded, whereupon the system may retrieve the corresponding animation for the selected live active icon from a data structure. Playing in-line with the text during the first viewing mode, as used herein, may include using animations that do not alter the position or placement of the live active icon with respect to the surrounding alphanumeric characters when the live active icon is displayed in the first viewing mode as discussed previously above.
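One minimal way to sketch an icon that rotates between display modes is a repeating frame sequence; the frame contents below are hypothetical stand-ins for a retrieved animation:

```python
# Illustrative sketch: an in-line animation that cycles between display
# modes without altering the icon's position relative to surrounding text.
from itertools import cycle

def make_animation(frames):
    """Return a callable that produces the next frame of the icon."""
    frame_iter = cycle(frames)  # repeats the sequence indefinitely
    return lambda: next(frame_iter)

# Hypothetical frames alternating between a graphic and alphanumerics.
next_frame = make_animation(["✈", "DELAYED", "Gate B7"])
first = next_frame()   # "✈"
second = next_frame()  # "DELAYED"
```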


By way of example, FIG. 5 depicts an interface 510 that allows a user to select indicator 520 to elect the live active icon to be animated. By way of example, in FIG. 8A a live active icon 812A is depicted representing the weather-based live application's forecast for Vail, CO on Mar. 8, 2022 as having thunderstorms where the live active icon 812A is rendered with a cloud with rain and a single lightning bolt. The animation of that live active icon is illustrated in FIG. 8B where the live active icon 812B is re-rendered as a cloud without rain but with three lightning bolts. The animation of the live active icon in FIG. 8A and FIG. 8B is shown in a two-step sequence, but it should be understood that an animation may manipulate a live active icon in any number of sequences and may manipulate the live active icon in any manner.


Some disclosed embodiments may include receiving a selection of a live active icon. Selecting the live active icon, as used herein, may include the use of a keyboard or a pointing device (e.g., a mouse or a trackball) by which the user can provide input (e.g., a click, gesture, cursor hover, or any other interaction) to an associated computing device to indicate an intent to elect a particular live active icon that may be displayed on an associated display of the computing device. Other kinds of devices can be used to provide for interaction with a user to facilitate the selection as well; for example, sensory interaction provided by the user can be any form of sensory interaction (e.g., visual interaction, auditory interaction, or tactile interaction).


By way of example, FIG. 9A shows that the input of a selection of a live active icon 920A can be performed using a pointing device 922A.


Some disclosed embodiments may include, in response to a selection, presenting, in a second viewing mode, an expanded view of a live application. Presenting a second viewing mode may include rendering a visual representation that may be rendered dynamically and correspond to information associated with a live application as well as using an auditory device, tactile device, or any other form of sensory feedback. The information included in the second viewing mode may include more, less, or the same data present in the first viewing mode (e.g., the rendering of the live active icon). An expanded view of the live application may include a display of additional information related to the live application or any other form of sensory feedback including additional information relative to the live application, which may be rendered on a larger area of a display than that of the first viewing mode. The information to be included in the second viewing mode, as used herein, may include live application data manually identified by the user or data identified based on logical rules, automation, machine learning, or any other manner of identifying relevant data to include in the expanded view. For example, the system may use contextual detection to determine the type of data present in the live application and use that classification to find the corresponding data to be presented in a second viewing mode based on a relationship stored in a data structure. In some embodiments, the at least one processor is configured to present the second viewing mode in an iframe. In some embodiments, the live active icon in a first viewing mode may have an appearance corresponding to imagery present in the expanded view. The appearance of a live active icon may include the rendered display of a live active icon to the user, the animation or sequence of a live active icon, the data or metadata of a live active icon, or any other sensory feedback associated with the live active icon. 
Imagery present in the expanded view may include images, alphanumerics, text, data, metadata, video, sound, or any other sensory feedback that is present within the display of information relative to the live application. An appearance corresponding to the imagery present in the expanded view may include dynamically changing the appearance of a live active icon to possess similar data, text, color, alphanumerics, images, or any other sensory feedback present in the expanded view. For example, the processor may detect the information present in an expanded view (e.g., full information from a live application) and look up a rule for a corresponding appearance stored in a data structure for the live active icon (e.g., abridged information). The corresponding appearance may correlate with the full information in an expanded view. For example, a live application in an expanded view may include a visual display of multiple depictions of racecars racing around a track. In a corresponding live active icon (e.g., the first viewing mode), the live active icon may contain a visual rendering of a single racecar (similar imagery) or a checkered flag (different but related imagery) to correspond to the imagery in the expanded view. Further, the system may use contextual analysis based on the classification of a live application (e.g., determining a live application possesses information related to an airplane flight) to determine which data present in the expanded view to include in the live active icon (e.g., the flight's status and gate information). Additionally, the appearance of a live active icon may change form from an image to text, text to animation, audible output to another form of sensory feedback, or from any first appearance to a second appearance. For example, a live active icon may initially be depicted as a sun to reflect imagery present in an expanded view consisting of a sunny weather forecast. 
If the system's connection with the live application were to be interrupted, the exemplary expanded view may consist of an “error” message, and as such, the live active icon may dynamically change from the sun to a text-based live active icon depicting “ERROR.”
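The correspondence between an expanded view and the icon's appearance, including the text-based "ERROR" fallback, might be sketched as follows; the rule table and state names are illustrative assumptions:

```python
# Illustrative sketch: derive the live active icon's appearance from the
# imagery present in the expanded view. APPEARANCE_RULES is a
# hypothetical data structure of appearance rules.
APPEARANCE_RULES = {
    "sunny": {"kind": "image", "glyph": "sun"},
    "rain": {"kind": "image", "glyph": "cloud-with-raindrops"},
}

def icon_appearance(expanded_view):
    """Look up an icon appearance corresponding to the expanded view."""
    if expanded_view.get("error"):
        # Connection interrupted: change form from an image to text.
        return {"kind": "text", "glyph": "ERROR"}
    state = expanded_view.get("state")
    return APPEARANCE_RULES.get(state, {"kind": "text", "glyph": str(state)})

normal = icon_appearance({"state": "sunny"})  # image of a sun
fallback = icon_appearance({"error": True})   # text-based "ERROR" icon
```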


By way of example, in FIG. 9B, in response to the selection of active icon 920A with cursor 922A of FIG. 9A, a second viewing mode 924B may be presented that includes additional information 926B from the corresponding live application. In other embodiments, the system may have stored the underlying data to display additional information 926B as all, or less than all but more than the information included in the live active icon, of the information that is normally presented in the live application (e.g., the second viewing mode 924B). As a result of embedding a live active icon (e.g., live active icon 920A of FIG. 9A) that is associated with the live application, the system may present abridged information from the live application for the live active icon 920A. For example, second viewing mode 924B of FIG. 9B presents information, including additional information 926B, that comprises all, or less than all but more than the information included in the live active icon, of the information normally presented in the live application. Corresponding live active icon 920A of FIG. 9A may present abridged information to display only "DELAYED" and "Gate B7," which represent part of the available underlying information associated with the live application (as presented in the second viewing mode 924B of FIG. 9B).


Further, in FIG. 9A, the appearance of an exemplary live active icon 920A in its first viewing mode contains text corresponding to the displayed additional information 926B in the expanded view 924B of FIG. 9B. The text included in the appearance of the exemplary live active icon 920A of FIG. 9A may be set by the user, retrieved from a data structure, determined by rules, automation, machine learning, artificial intelligence, or any other method of analyzing data and formulating an output.


In some embodiments, an interface may include a permission tool for enabling selective access restriction to at least one of a live active icon or an expanded view. Enabling selective access restriction may include altering a presentation of at least a portion of information in an electronic word processing document, altering a user's interaction with a portion of information in the electronic word processing document, or any other method of restricting access to a portion of information in the electronic word processing document to prohibit or otherwise reduce a user's ability to view or edit a particular portion of information. An expanded view may include a presentation of information that is more substantive than the presentation of information in a live active icon, consistent with the discussion above regarding the second viewing mode for presenting information of the live application. For example, enabling selective access restriction may include enabling selectable portions of the live active icons or their expanded views in the electronic word processing document to be altered visually (e.g., redacted, blurred, or any other visual manipulation) or changing the settings of the electronic word processing document such that only authorized users can interact with the selected portions or the entirety of the information displayed in either the live active icon or in the expanded view. A permission tool as used herein may include graphical user interface elements or any other manner enabling the support of the management of the input, display, and access of users attempting to interact with or access information associated with a live active icon or the expanded view (e.g., the live application).
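A simplified sketch of selective access restriction might look like the following, where redaction stands in for any visual manipulation and the user names are hypothetical:

```python
# Illustrative sketch: only authorized users see the live active icon's
# abridged information; everyone else receives a redacted rendering.
def render_for_user(user, icon, authorized_users):
    """Return the content a given user is permitted to view."""
    if user in authorized_users:
        return icon["abridged"]            # normal first-viewing-mode display
    return "█" * len(icon["abridged"])     # redacted visual manipulation

icon = {"abridged": "DELAYED"}
visible = render_for_user("alice", icon, {"alice", "bob"})  # "DELAYED"
hidden = render_for_user("eve", icon, {"alice", "bob"})     # redacted
```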


By way of example, FIG. 5 depicts an interface 510 allowing a user to control access, via permission indicator 522, by entering control settings into permission menu indicator 524, which can allow the user to select from a dropdown menu or manually enter names of parties that are allowed to access the live application or expanded view. However, it should be understood that the manner of enabling the support of the management of the input, display, and access of users attempting to interact with a live active icon or the expanded view should not be limited to these examples.


Some disclosed embodiments may include receiving a collapse instruction. A collapse instruction, as used herein, may include a command signal indicating an intent to reduce or obscure the presentation of information. Receiving a collapse instruction may include receiving the command signal by the use of a keyboard or a pointing device (e.g., a mouse or a trackball) by which the user can provide input to a computing device, or through the lack of an instruction to default to the collapse instruction (e.g., a time out threshold is reached for inactivity). Other kinds of devices may also provide for a collapse instruction; for example, a sensory instruction provided by the user (e.g., visual instruction, auditory instruction, or tactile instruction). Further, the collapse instruction may be transmitted based on a corresponding rule, retrieved from a data structure, dependent on the data present in the second viewing mode, or based on a permission tool parameter (e.g., allowing the user, as a part of the permission tool, to set a maximum duration that other users may view the second viewing mode). Some disclosed embodiments may include, in response to the collapse instruction, reverting from the second viewing mode to the first viewing mode. Reverting from the second viewing mode to the first viewing mode, as used herein, may include closing or otherwise obscuring the second viewing mode or any other manner of transitioning from the second viewing mode to the first viewing mode.
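The two viewing modes and the inactivity default described above can be sketched as a small state machine; the threshold value is an illustrative assumption:

```python
# Illustrative sketch: a live active icon toggling between its first
# (collapsed, in-line) and second (expanded) viewing modes.
INACTIVITY_THRESHOLD = 30.0  # seconds; hypothetical default time out

class LiveActiveIcon:
    def __init__(self):
        self.mode = "first"   # collapsed, in-line icon

    def select(self):
        self.mode = "second"  # expanded view of the live application

    def collapse(self):
        self.mode = "first"   # revert to the first viewing mode

    def tick(self, idle_seconds):
        # Lack of an instruction defaults to a collapse instruction.
        if self.mode == "second" and idle_seconds >= INACTIVITY_THRESHOLD:
            self.collapse()

icon = LiveActiveIcon()
icon.select()    # selection presents the second viewing mode
icon.tick(45.0)  # inactivity exceeds the threshold, so the icon reverts
```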


By way of example, as illustrated in FIG. 9A and FIG. 9B, reverting from the second viewing mode 924B of FIG. 9B would result in the live active icon returning to its first viewing mode 920A as shown by FIG. 9A. This may be a result of a user, through an associated computing device, sending an instruction to the system to revert to the first viewing mode via a collapse instruction. This collapse instruction may be received when the user's cursor 922A selects an activatable element that sends the collapse instruction to the system, or when the user's cursor 922A stops moving in the display over a period of time (that may be a default or defined), in which the system may also default to interpreting this as a collapse instruction. The collapse instruction may also be received when the user's cursor 922A selects a different live active icon or when the user's cursor 922A selects any part of the electronic word processing document external to the second viewing mode 924B.



FIG. 10 illustrates a block diagram of an example process 1010 for causing dynamic activity in an electronic word processing document. While the block diagram may be described below in connection with certain implementation embodiments presented in other figures, those implementations are provided for illustrative purposes only, and are not intended to serve as a limitation on the block diagram. In some embodiments, the process 1010 may be performed by at least one processor (e.g., the processing circuitry 110 in FIG. 1) of a computing device (e.g., the computing device 100 in FIGS. 1 and 2) to perform operations or functions described herein and may be described hereinafter with reference to FIGS. 4 to 9B by way of example. In some embodiments, some aspects of the process 1010 may be implemented as software (e.g., program codes or instructions) that are stored in a memory (e.g., the memory portion 122 in FIG. 1) or a non-transitory computer-readable medium. In some embodiments, some aspects of the process 1010 may be implemented as hardware (e.g., a specific-purpose circuit). In some embodiments, the process 1010 may be implemented as a combination of software and hardware.



FIG. 10 includes process blocks 1012 to 1026. At block 1012, a processing means (e.g., any type of processor described herein or that otherwise performs actions on data) may access an electronic word processing document, consistent with some embodiments of the present disclosure.


At block 1014, the processing means may present an interface enabling selection of a live application. The live application may be outside the electronic word processing document and the selection may be made for embedding the live application in the electronic word processing document, as previously discussed in the disclosure above.


At block 1016, the processing means may embed a live active icon representative of the live application. The live active icon may be embedded in-line with text of the electronic word processing document, consistent with the discussion above.


At block 1018, the processing means may present the live active icon in a first viewing mode where the live active icon dynamically changes based on outside occurrences. The live active icon in the first viewing mode may be embedded in-line with text of the electronic word processing document, consistent with the discussion above.


At block 1020, the processing means may receive a selection of the live active icon, as previously discussed in the disclosure above.


At block 1022, the processing means may present a second viewing mode of the live application. The second viewing mode may be an expanded view of the live application, consistent with the discussion above.


At block 1024, the processing means may receive a collapse instruction, as previously discussed in the disclosure above.


At block 1026, the processing means may revert from the second viewing mode to the first viewing mode, as previously discussed in the disclosure above.


This disclosure presents various mechanisms for dynamic work systems. Such systems may involve operations that enable electronic word processing documents to include dynamic activity. By way of one example, operations may enable various dynamic elements from an external file to be reflected in an electronic word processing document. It is intended that one or more aspects of any mechanism may be combined with one or more aspects of any other mechanism, and such combinations are within the scope of this disclosure.


This disclosure is constructed to provide a basic understanding of a few exemplary embodiments with the understanding that features of the exemplary embodiments may be combined with other disclosed features or may be incorporated into platforms or embodiments not described herein while still remaining within the scope of this disclosure. For convenience, any form of the word "embodiment" as used herein is intended to refer to a single embodiment or multiple embodiments of the disclosure.


In electronic word processing documents, it may be beneficial to employ a myriad of actions for triggering edits to the document when one or more conditions are met. Ensuring that the information in an electronic word processing document is up-to-date when that information is related to dynamically changing files external to the electronic word processing document can be daunting when the possible changes to the applications could be endless. Therefore, there may be a need for unconventional innovations for ensuring that data in an electronic word processing document is up-to-date and correct through efficient processing and storing methods.


Some disclosed embodiments may involve systems, methods, and computer-readable media for automatically updating an electronic word processing document based on a change in a linked file and vice versa. The systems and methods described herein may be implemented with the aid of at least one processor or non-transitory computer readable medium, such as a CPU, FPGA, ASIC, or any other processing structure(s) or storage medium, as described herein. Electronic word processing documents (and other variations of the term) as used herein are not limited to only digital files for word processing, but may include any other processing document such as presentation slides, tables, databases, graphics, sound files, video files or any other digital document or file. Electronic word processing documents may include any digital file that may provide for input, editing, formatting, display, and/or output of text, graphics, widgets, objects, tables, links, animations, dynamically updated elements, or any other data object that may be used in conjunction with the digital file. Any information stored on or displayed from an electronic word processing document may be organized into blocks. A block may include any organizational unit of information in a digital file, such as a single text character, word, sentence, paragraph, page, graphic, or any combination thereof. Blocks may include static or dynamic information, and may be linked to other sources of data for dynamic updates. Blocks may be automatically organized by the system, or may be manually selected by a user according to preference. In one embodiment, a user may select a segment of any information in an electronic word processing document and assign it as a particular block for input, editing, formatting, or any other further configuration. An electronic word processing document may be stored in one or more repositories connected to a network accessible by one or more users through their computing devices.
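The block concept described above might be sketched as a simple data model; the field names below are illustrative assumptions rather than a disclosed schema:

```python
# Illustrative sketch: a "block" as an organizational unit of information
# in a digital file, optionally linked to a source for dynamic updates.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Block:
    content: str                         # text character, word, sentence...
    kind: str = "paragraph"              # organizational granularity
    linked_source: Optional[str] = None  # hypothetical source for updates

@dataclass
class Document:
    blocks: list = field(default_factory=list)

    def assign_block(self, segment, kind="paragraph", source=None):
        """Manually select a segment and assign it as a particular block."""
        block = Block(content=segment, kind=kind, linked_source=source)
        self.blocks.append(block)
        return block

doc = Document()
doc.assign_block("New hire orientation schedule", kind="sentence")
```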


Automatically updating an electronic word processing document may include carrying out instructions to sync, change, manipulate, or otherwise alter information associated with an electronic word processing document. Such automatic updating may occur in response to a change in a linked file, or vice versa (e.g., causing an automatic update to the linked file in response to a change in the electronic word processing document), or any other trigger or threshold being met. It should be understood that all embodiments and disclosures discussed and disclosed herein do not have to operate in a certain order (e.g., variable data element to corresponding data in the external file, data in external file to corresponding variable data element). As such, all changes, updates, edits, or other manipulations should be understood to occur in any manner, sequence, direction, and do not possess a structured order. Updating may be initiated by the user or by the system based on a trigger or threshold being met. A linked file may include any electronic document that may be associated with or otherwise have an established relationship with the electronic word processing document. A linked file may also include another electronic word processing document, files or data external to the electronic word processing software or application, or any other type of file or set of data (e.g., presentations, audio files, video files, tables, data sets). A change in a linked file may include any update, alteration, manipulation, or any other form of variation to the data present in a linked file in its entirety or to a portion, region, block, or section of the data present in a linked file including metadata. Detecting a change in a linked file may involve receiving an API call (or other type of software call) regarding a change to the entirety or a portion, region, block, or section of a linked file. 
Detecting a change in a linked file may also include the system storing the data present in a linked file in a data structure and periodically accessing the linked file to evaluate if the data present in the linked file has changed, such as by scraping HTML text of the file and comparing it to the data from the linked file stored in the data structure. The periodic evaluation of the data present in the linked file may be established by a user at any time interval (e.g., every millisecond, second, minute, hour, day, or any other increment) or may be established by the system using an automation, logical rules, machine learning, artificial intelligence, or any other manner of establishing a time-interval-based or event-dependent evaluation of data present in a linked file.
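One common way to implement the periodic evaluation described above is to store a digest of the linked file's contents and compare it on each access; the hash comparison in this sketch is an assumption rather than a disclosed method, and the file contents are hypothetical:

```python
# Illustrative sketch: detect a change in a linked file by comparing a
# stored digest of its scraped text against the currently fetched text.
import hashlib

def digest(text):
    """Compute a stable fingerprint of the linked file's contents."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

def has_changed(stored_digest, fetched_text):
    """Evaluate if the data present in the linked file has changed."""
    return digest(fetched_text) != stored_digest

# Store the linked file's state in a data structure (here, a variable).
record = digest("Flight UA 123: ON TIME")

# On a later periodic access, the fetched text differs, so a change
# is detected and the document may be updated.
changed = has_changed(record, "Flight UA 123: DELAYED")  # True
```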


By way of example, FIG. 11 illustrates an electronic word processing document 1110, consistent with some embodiments of the present disclosure. As shown in the figure, an electronic word processing document 1110 can include information regarding a schedule created by a user of the electronic word processing document 1110. For ease of discussion, the electronic word processing document 1110 presented in the figure may be representative of displaying a new hire orientation schedule created by the user that is to be distributed to the listed speakers and new hires, but, as explained above, it is to be understood that the electronic word processing document can be any digital file.


Some embodiments may include one or more of automations, logical rules, logical sentence structures and logical (sentence structure) templates. While these terms are described herein in differing contexts, in a broadest sense, in each instance an automation may include a process that responds to a trigger or condition to produce an outcome; a logical rule may underlie the automation in order to implement the automation via a set of instructions; a logical sentence structure is one way for a user to define an automation; and a logical template/logical sentence structure template may be a fill-in-the-blank tool used to construct a logical sentence structure. While all automations may have an underlying logical rule, all automations need not implement that rule through a logical sentence structure. Any other manner of defining a process that responds to a trigger or condition to produce an outcome may be used to construct an automation.
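By way of a non-limiting illustration, an automation with an underlying logical rule may be sketched in Python as a condition paired with an action; the event fields and message text below are hypothetical.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Automation:
    """A process that responds to a trigger or condition to produce an outcome."""
    condition: Callable[[dict], bool]   # the underlying logical rule's test
    action: Callable[[dict], str]       # the outcome produced when the rule fires

    def run(self, event: dict) -> Optional[str]:
        return self.action(event) if self.condition(event) else None

# A rule a user might express through the logical sentence structure
# "When a variable data element changes, notify the document author."
rule = Automation(
    condition=lambda e: e["type"] == "variable_data_element_changed",
    action=lambda e: f"notify author about change to {e['element']}",
)
print(rule.run({"type": "variable_data_element_changed", "element": "speaker"}))
```

As noted above, a logical sentence structure is only one way to define such an automation; any equivalent definition of the trigger-to-outcome process would serve.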


Some disclosed embodiments may include accessing an electronic word processing document. Accessing an electronic word processing document may include retrieving the electronic word processing document from a storage medium, such as a local storage medium or a remote storage medium. A local storage medium may be maintained, for example, on a local computing device, on a local network, or on a resource such as a server within or connected to a local network. A remote storage medium may be maintained in the cloud, or at any other location other than a local network. In some embodiments, accessing the electronic word processing document may include retrieving the electronic word processing document from a web browser cache. Additionally or alternatively, accessing the electronic word processing document may include accessing a live data stream of the electronic word processing document from a remote source. In some embodiments, accessing the electronic word processing document may include logging into an account having a permission to access the document. For example, accessing the electronic word processing document may be achieved by interacting with an indication associated with the electronic word processing document, such as an icon or file name, which may cause the system to retrieve (e.g., from a storage medium) a particular electronic word processing document associated with the indication.
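By way of a non-limiting illustration, accessing an electronic word processing document by interacting with an indication (here a document identifier) after logging into an account with permission may be sketched as follows; the storage, accounts, and document contents are hypothetical stand-ins for a repository.

```python
# Hypothetical storage medium and account permissions.
documents = {"doc-42": "New Hire Orientation Schedule for January 2022"}
permissions = {"doc-42": {"alice"}}

def access_document(doc_id: str, account: str) -> str:
    """Retrieve a document associated with an indication (its identifier),
    subject to the account's permission to access it."""
    if account not in permissions.get(doc_id, set()):
        raise PermissionError(f"{account} may not access {doc_id}")
    return documents[doc_id]

print(access_document("doc-42", "alice"))
```

A real deployment would retrieve from a local or remote storage medium (or a browser cache) rather than an in-memory mapping.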


For example, as shown in FIG. 2, a user device 220-1 may send a request to access the electronic word processing document to the network 210. The request can then be communicated to the repository 230-1 where the document is stored via the database management system 235-1. The electronic word processing document can be retrieved from the repository 230-1 and transferred through the database management service 235-1 and network 210 for display on the user device 220-1.


Some disclosed embodiments may include identifying in an electronic word processing document a variable data element, wherein the variable data element may include current data presented in the electronic word processing document and a link to a file external to the electronic word processing document. A variable data element may include any text, image, alphanumeric, video file, audio file, or any other information present in an electronic word processing document that may be subject to automatic updates such that the information in the variable data element may be considered to be dynamic information. Identifying a variable data element in an electronic word processing document may include analyzing the information present in the electronic word processing document to automatically detect if any information possesses a link to an external file. Identifying a variable data element in an electronic word processing document may also include the system accessing a data structure to identify the current data presented in the electronic word processing document that is stored in the data structure to correspond to a variable data element with its corresponding link(s) to external file(s). In additional embodiments, identifying a variable data element may include a manual selection of static information in an electronic document to designate that the selection is a variable data element that may be reconfigured to include dynamic information (e.g., by linking the selected information to an external file). Current data presented in the electronic word processing document, as used herein, may include any information (e.g., image, text, alphanumeric, video file, audio file, or any other data) present in the electronic word processing document that may correspond to a variable data element. A variable data element may include a link to a file external to the electronic word processing document. 
A link to a file external to the electronic word processing document may include a functioning hyperlink that may be activated or triggered to access and retrieve data in a separate electronic document from the electronic word processing document within the system or external to the system. Activating the link may cause the processor to retrieve information in an external file from a storage medium, such as a local storage medium or a remote storage medium. For example, the link may include a text hyperlink, image hyperlink, bookmark hyperlink, or any other type of link that may allow the system to retrieve the external file from a separate storage device or a third party platform independent from the electronic word processing document. A file external to the electronic word processing document may include a file hosted by a third party platform independent from the electronic word processing document, a file separate from the electronic word processing document, or any other collection of data outside of the electronic word processing document (e.g., audio files, video files, data files, etc.). In some embodiments, an external file may include an additional electronic word processing document. In some embodiments, the current data may include text of the electronic word processing document and the link may include metadata associated with the text. As discussed above, the variable data element may include current data presented in the electronic word processing document and a link to a file external to the electronic word processing document. The variable data element may include current data in the form of text (e.g., the text “DEAL PENDING”) that may be configured to be dynamic. The link may include metadata associated with the text in a manner that reflects the semantic meaning of the text in the current data. 
For example, when the variable data element includes the text “DEAL PENDING” in a first electronic document, the link between the variable data element to the external file (e.g., a second electronic document) may be an activatable hyperlink with tagged information indicative of the status of the variable data element as pending or incomplete. In this way, the tagged information in the form of metadata may be retrieved and presented on a display, or may be transmitted across a network to the external file (e.g., the second electronic document) so that the status of the variable data element in the first electronic document may be transmitted without the need for an additional accessing or retrieving of information step of data in the first electronic document to decrease processing times and decrease memory usage.
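By way of a non-limiting illustration, a variable data element comprising current data, a link to an external file, and semantic metadata (such as the tagged "pending" status described above) may be sketched as a simple data structure; the link address is hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class VariableDataElement:
    """Dynamic information in a document plus a link to an external file."""
    current_data: str            # e.g., the text "DEAL PENDING"
    link: str                    # activatable hyperlink to the external file
    metadata: dict = field(default_factory=dict)  # tags reflecting semantics

element = VariableDataElement(
    current_data="DEAL PENDING",
    link="https://example.com/deals/123",   # hypothetical address
    metadata={"status": "pending"},
)
print(element.metadata["status"])  # pending
```

Because the status lives in the metadata, it can be transmitted to the external file without re-reading the first document, consistent with the processing-time and memory savings described above.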


By way of example, FIG. 12 illustrates a file 1210 external to the electronic word processing document 1110 of FIG. 11, consistent with some embodiments of the present disclosure. As shown in FIG. 12, the file 1210 external to the electronic word processing document 1110 may be another electronic word processing document and can include information regarding a schedule created by a user. For ease of discussion, in the particular example depicted by FIG. 12, the external file 1210 is a new hire orientation schedule prepared by the Human Resources department of a company that is organizing the new hire orientation, but, as explained above, it is to be understood that the file external to an electronic word processing document can be any collection of data outside of the electronic word processing document. As such, for this discussion, only employees in the Human Resources department may have access to the planning document. As discussed in more detail below, a variable data element may be designated from current data in an electronic document such as electronic document 1410 of FIG. 14. The current data in the electronic document may be in the form of textual information such as variable data elements 1412, 1414, and 1416.


In some embodiments, the at least one processor may be further configured to present an interface in an electronic word processing document for enabling designation of document text as a variable data element and for enabling designation of a file as a source of replacement data. Presenting an interface in the electronic word processing document may include rendering a display of information with activatable elements that may enable interaction with the information through a computing device. It should be understood that the rendering of this display may occur within the electronic word processing document, outside of the word processing document, in an iframe, or in any other manner of rendering the display to the user. An interface enabling designation of a variable data element may include any rendered display of information that may include options corresponding to different data present in the electronic word processing document with the same or different functionality such that any of the data present in the electronic word processing document may be selected through an interaction from a computing device associated with a user (e.g., through an activatable element such as a graphical button). Designation of a variable data element may include the use of an interface allowing the user to manually identify, via interaction with a computing device associated with the user, textual input, or any other sensory form (visual, auditory, or tactile) of input, data or sets of data, including document text (e.g., alphanumerics, graphics, or a combination thereof), present in the electronic word processing document to be a variable data element. Designation of a variable data element may also include the processor implementing logical rules, automations, machine learning, or artificial intelligence (e.g., semantic analysis) to determine and designate information in an electronic document as a variable data element. 
For example, an interface may allow a user to designate document text present in an electronic word processing document as a variable data element by using an interface allowing the user to select the document text through an interaction from a computing device (e.g., a mouse, keyboard, touchscreen, or any other device) associated with a user. An interface enabling designation of a file as a source of the replacement data may include any rendered display of information that may include options corresponding to different files with the same or different functionality such that any of the files may be selected through an interaction from a computing device associated with a user (e.g., through an activatable element such as a graphical button). Designation of a file as a source of the replacement data may include allowing the user to manually identify and assign, via textual or any other sensory form (visual, auditory, or tactile) of input, an external file using an interface that allows the user to upload the identification information of the file (e.g., a web address, a file location, or any other address or file path). The user may also designate a file as a source of replacement data by manually entering, via textual or any other sensory form (visual, auditory, or tactile) of input, the identification information of the file in-line with the text or other data contained in the electronic word processing document. 
A source of replacement data, as used herein, may include any electronic file containing data or information (e.g., text, images, data, alphanumerics, video files, audio files, or any other data in the external file) that the user or system selects to correspond to or is otherwise linked or associated with the current data (e.g., document text) in the electronic word processing document represented by a variable data element such that if there is a change in the source replacement data in the external file, the current data of the corresponding variable data element in the electronic word processing document will change to match or reflect a change in the replacement data. For example, the user may utilize an interface to select document text present in the electronic word processing document to be designated as current data for a variable data element and use the interface to manually enter the file location of the external file and identify the replacement data in that file corresponding to the selected variable data element. As another example, the system may allow the user to identify the relevant file(s) and replacement data and store the data, and replacement data, of the relevant file(s) in a data structure. The system may then perform contextual analysis, or any form of automation, machine learning, semantic analysis, or artificial intelligence, on the current data present in the electronic word processing document to suggest, recommend, or identify data present in the electronic word processing document to be designated as current data for a variable data element linked to one or more of the replacement data in the relevant files identified by the user. The system may store the variable data element, the link(s) to the corresponding external file(s), and the replacement data in those files in a data structure.
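By way of a non-limiting illustration, designating document text as a variable data element and a file as its source of replacement data may be sketched as entries in a data structure, as the passage above describes; the element identifiers, file path, and key names are hypothetical.

```python
# Data structure holding each designated variable data element together with
# the link to its source of replacement data.
variable_data_elements = {}

def designate(element_id: str, document_text: str,
              source_file: str, source_key: str) -> None:
    """Designate document text as a variable data element and a file
    (with a key locating the replacement data inside it) as its source."""
    variable_data_elements[element_id] = {
        "current_data": document_text,
        "source_file": source_file,
        "source_key": source_key,
    }

designate("speaker-jan-2", "Michelle Jones, CEO",
          "hr/new_hire_schedule.doc", "speaker_2022_01_02")
print(variable_data_elements["speaker-jan-2"]["source_file"])
```

In embodiments using contextual analysis, the system rather than the user might populate this structure by suggesting candidate text for designation.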



FIG. 13 illustrates an exemplary interface 1310 enabling designation of current data in the electronic word processing document 1110 of FIG. 11 as a variable data element via activatable element indicator 1312, consistent with some embodiments of the present disclosure. In FIG. 13, a user may be enabled to interact with indicator 1312 to confirm a selection of current data or to change or add a selection of another set of current data to designate as a variable data element. This may involve selecting a location in the electronic document to select data, or may involve a manual interaction with the current data in the electronic document (e.g., highlighting textual information) to make the selection. In FIG. 13, interface 1310 may enable designation of current data as a variable data element by interacting with lookup interface 1314 that may enable a user to manually enter text to identify current data located in the electronic word processing document 1110 of FIG. 11 or enable a user to browse the electronic word processing document 1110 of FIG. 11 and manually interact with the document to select current data as a variable data element. While not shown in this figure, it should be understood that the lookup interface 1314 may also feature a drop-down menu that allows the user to view all or filter by types of data present in the electronic word processing document 1110 of FIG. 11 for designation as a variable data element. For example, a user may interact with the lookup interface 1314 to view a rendered menu of all image files (e.g., JPG, PNG, etc.), retrieved from a data structure storing all data present in the document, present in the electronic word processing document 1110 of FIG. 11 and select an image from the menu to designate as a variable data element. In FIG. 13, an exemplary interface may also enable designation of a file via activatable element indicator 1316 as a source of replacement data, consistent with some embodiments of the present disclosure. A user may interact with lookup interface 1318 that may enable a user to manually enter identification information of a file (e.g., web address, file location, etc.) or enable a user to upload an external file. While not shown in this figure, it should be understood that the lookup interface 1318 may feature a drop-down menu allowing the user to designate recent files, or any other classification of files, as the source of the replacement data. Further, while not shown in this figure, it should be understood that the interface 1310 may allow a user to access the identified external file to identify the replacement data in the external file (e.g., specific data, a specific cell, a region of a document, the document in its entirety, etc.).



FIG. 14 illustrates an exemplary electronic word processing document 1410 containing current data that has been designated as variable data elements 1412, 1414, and 1416, consistent with some embodiments of the present disclosure. For ease of discussion, the text that has been designated as variable data elements 1412, 1414, and 1416 is displayed in bold and italics. However, it should be understood that a variable data element can be displayed in any manner distinguishing the data of the variable data element or in any manner not distinguishing the variable data element data from other data. For example, data that has been designated as a variable data element may be displayed with a small icon next to the data, may change color once designated, may change font style, may change size, or may be displayed with any other distinguishing feature or without distinguishing features.


Some disclosed embodiments may include accessing an external file identified in a link. Accessing an external file identified by a link may include retrieving data through any electrical medium such as one or more signals, instructions, operations, functions, databases, memories, hard drives, private data networks, virtual private networks, Wi-Fi networks, LAN or WAN networks, Ethernet cables, coaxial cables, twisted pair cables, fiber optics, public switched telephone networks, wireless cellular networks, BLUETOOTH™, BLUETOOTH LE™ (BLE), Wi-Fi, near field communications (NFC), and/or any other suitable communication method that provides a medium for exchanging data. Accessing an external file may also involve constructing an API call, establishing a connection with a source of the external file (e.g., using an API or other application interface), authenticating a recipient of application data, transmitting an API call, receiving application data (e.g., dynamic data), and/or any other electronic operation that facilitates use of information associated with the external file. The link may be associated with a location of an external file in a repository such that the processor may access and retrieve the data associated with the external file quickly by activating and interpreting the information associated with the link.
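By way of a non-limiting illustration, the access flow described above (constructing a call, authenticating the recipient, and receiving data) may be sketched as follows; the link, token, and in-memory repository standing in for a third-party platform are all hypothetical.

```python
def access_external_file(link: str, token: str) -> dict:
    """Sketch of accessing an external file identified by a link: build a
    call, authenticate the recipient, and retrieve the file's data."""
    request = {"url": link, "headers": {"Authorization": f"Bearer {token}"}}
    # A real system would transmit `request` over a network medium; here an
    # in-memory repository stands in for the third-party platform.
    repository = {
        "https://example.com/files/schedule": {"speaker": "Randall James, CTO"},
    }
    if token != "valid-token":
        raise PermissionError("recipient not authenticated")
    return repository[request["url"]]

print(access_external_file("https://example.com/files/schedule", "valid-token"))
```

The link itself is what associates the variable data element with the repository location, allowing quick retrieval on activation.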


Some disclosed embodiments may include pulling, from an external file, first replacement data corresponding to the current data. Pulling, from an external file, replacement data corresponding to the current data, as used herein, may include copying, duplicating, reproducing, extracting, or any other form of transferring the value of the data (e.g., information such as text) designated as the first replacement data in the external file corresponding to the current data in the electronic word processing document. For example, the system may access the external file and copy the image, text, audio file, video file, alphanumerics, or any other character or data that has been designated as the replacement data, as described throughout. The replacement data may then be retrieved from the external file for further processing or for transmission to the electronic word processing document so that the processor may re-render a display of the current data with the replacement data.
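By way of a non-limiting illustration, pulling replacement data may be sketched as copying a designated value out of the external file without modifying the file; the file contents and key are hypothetical.

```python
# Hypothetical external file with a value designated as replacement data.
external_file = {"speaker_2022_01_02": "Randall James, CTO"}

def pull_replacement_data(file: dict, key: str) -> str:
    """Copy the value designated as replacement data out of the external
    file, leaving the file itself untouched."""
    return file[key]

replacement = pull_replacement_data(external_file, "speaker_2022_01_02")
print(replacement)  # Randall James, CTO
```

The pulled value can then be processed further or transmitted so the processor may re-render the current data in the document.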


Some disclosed embodiments may include replacing current data in an electronic word processing document with first replacement data. Replacing current data in an electronic word processing document, as used herein, may include overriding, substituting, editing, making note of, re-rendering a display of information, or any other form of changing the current data in an electronic word processing document to reflect a change in the first replacement data. However, it should be understood that replacing current data in an electronic word processing document with first replacement data does not require the current data, after replacing, to be identical to the replacement data. For example, if the settings of the source of the replacement data (an external file) allow the value of the replacement data to extend to five significant figures and the settings of the electronic word processing document only allow data to extend to three significant figures, replacing the current data with the replacement data may result in the replaced current data and the replacement data not being equivalent.
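By way of a non-limiting illustration, the significant-figures example above may be sketched as follows: the document applies its own display settings when replacing, so the replaced value need not be identical to the replacement data. The three-significant-figure setting is hypothetical.

```python
def replace_current_data(replacement: float, doc_sig_figs: int = 3) -> float:
    """Replace current data with replacement data, subject to the document's
    own display settings, so the result need not equal the source value."""
    # Round to the document's allowed number of significant figures.
    return float(f"{replacement:.{doc_sig_figs}g}")

# External file allows five significant figures; the document allows three.
print(replace_current_data(3.1416))   # 3.14 -- not identical to 3.1416
```

Any other document-side setting (font, locale, truncation) could likewise make the replaced data differ from the source replacement data.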


By way of example, FIG. 15A illustrates a file 1510A external to an electronic word processing document 1110 of FIG. 11 displaying an updated version of the external file 1210 in FIG. 12. The text 1512A represents an updated entry for the assigned speaker for the welcome speech scheduled on Jan. 2, 2022. For ease of discussion, similar to the external file 1210 in FIG. 12, this file 1510A may be internal to the Human Resources department of a company such that only employees of that department can access this document, which may serve as a source of replacement data. For example, electronic document 1410 has current data "Michelle Jones, CEO" 1412 in the form of text that has been designated as a variable data element. This variable data element 1412 may be linked to an external file serving as a source of replacement data as shown in FIG. 15A. FIG. 15A shows an example of a change that has been made to the speaker from Michelle Jones, CEO 1412 of FIG. 14 (e.g., the "current data") to Randall James, CTO 1512A (e.g., the "replacement data") of FIG. 15A. As a result of this change in the source of replacement data for the speaker on Jan. 2, 2022, the system may update the variable data element 1412 of FIG. 14 to reflect the updated speaker to be Randall James, CTO, reflected in updated variable data element 1512B of electronic document 1510B. FIG. 15B illustrates an electronic word processing document 1510B containing variable data elements 1512B, 1514B, and 1516B. FIG. 15B illustrates the replacing of the former document text of variable data element 1412 of FIG. 14, displayed as Michelle Jones, CEO, with the corresponding value of the replacement data 1512A of variable data element 1512B, as depicted in FIG. 15A. The document text of variable data element 1512B of FIG. 15B now matches the replacement data 1512A of FIG. 15A, displayed as Randall James, CTO.


Some embodiments may include identifying a change to a variable data element in an electronic word processing document. A change to a variable data element in the electronic word processing document may include any editing, manipulating, updating, altering (e.g., addition, subtraction, rearrangement, or a combination thereof), re-sizing a display, or any other form of variation to the variable data element. For example, editing the text of a text-based variable data element or changing the percentage represented in a pie chart of an image-based variable data element may constitute a change to the variable data element. Identifying a change to a variable data element may include the processor comparing the value of the data of the variable data element to the value of the prior current data stored in a data structure for the corresponding variable data element, or using any other method of evaluating the value of the current data of a variable data element. The processor may initiate a comparison after detecting a user's interaction with the document resulting in an edit of the document, such as a user editing the text of a text-based variable data element, highlighting a portion of a variable data element and deleting it, or any other user interaction with the document resulting in an edit or manipulation of a variable data element. Further, the system may evaluate the value of data corresponding to a variable data element upon trigger events, such as when the document is opened, when the document is saved, after a certain amount of time has passed, or any other event that may trigger an evaluation of the data corresponding to the variable data element.
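By way of a non-limiting illustration, the comparison described above may be sketched as follows: the prior current data for each variable data element is kept in a data structure, and a mismatch with the displayed value identifies a change. The element identifier and values are hypothetical.

```python
# Data structure holding the prior current data for each variable data element.
stored_elements = {"speaker-jan-2": "Michelle Jones, CEO"}

def identify_change(element_id: str, displayed_value: str) -> bool:
    """Compare the displayed value against the prior current data kept in
    the data structure; a mismatch identifies a change to the element."""
    return stored_elements.get(element_id) != displayed_value

print(identify_change("speaker-jan-2", "Michelle Jones, CEO"))  # False
print(identify_change("speaker-jan-2", "Randall James, CTO"))   # True
```

Such a comparison could be initiated on an edit interaction or on trigger events such as opening or saving the document.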


By way of example, FIG. 16A illustrates an electronic word processing document 1610A including variable data elements 1612A, 1614A, and 1616A, consistent with some embodiments of the present disclosure. As illustrated in FIG. 16A, variable data element 1614A (document text of Jan. 4, 2022) and variable data element 1616A (document text of Jan. 3, 2022) have changed from the former document text values of variable data element 1514B (document text of Jan. 3, 2022) and variable data element 1516B (document text of Jan. 4, 2022), as illustrated in FIG. 15B. For ease of discussion, the new values of variable data elements 1614A and 1616A present in the electronic word processing document 1610A may have been entered manually by an entity with access to the electronic word processing document 1610A. For example, Sam Miller, Benefits Coordinator in this example, may have had a scheduling conflict on Jan. 4, 2022, and thus edited the schedule on an associated computing device to switch timeslots with Carl Howard such that Carl would present on January 4th and Sam could present on January 3rd in a source of replacement data. In response to these edits, the processor may receive the input as replacement data and transmit the information to the variable data elements and cause the display to re-render the variable data elements with the updated information input by Sam. Further to the example, but not present in the figure, when the system detects Sam interacting with the document text of the variable data elements, the system may evaluate the data of the variable data elements Sam interacts with and compare the data to the corresponding variable data element stored in a data structure to determine if Sam edited the variable data elements as shown by variable data elements 1616A and 1614A.


In some embodiments, at least one processor may be configured to transmit a message to a designated entity when a variable data element is changed. A designated entity may include any assigned name, phone number, email address, employee identification number, or any other identifying information to deliver or transmit a message or notification to. Establishing a designated entity may be accomplished manually by the user via an interface allowing a user to manually enter entity information or may be accomplished automatically by the system via logical rules, automation, machine learning, or artificial intelligence. For example, a logical rule may be established such that if a change to a variable data element is identified, a message is sent to the author of the document, the entity that designated the data as a variable data element, or any other entity involved or interested in the document. Transmitting a message to a designated entity when a variable data element is changed may include sending a message via email, SMS, MMS, push notifications, phone call, or any other manner of communicating information relating to the change that occurred in the variable data element. For example, if text representing the name of the presenter for a presentation was designated to be a variable data element and the name of the presenter was changed, the user may have designated the entities to receive a message to be employees with the names matching that of the previously listed presenter and the newly listed presenter. In another example, the system may use logical rules to determine the designated entities. Further to this example, if text representing a time frame for a series of presentations is changed, a logical rule may designate the entities to receive a message to be all listed presenters or only the presenters whose time slots were changed.
In addition to this example, the user or system may establish a threshold of change that must be met to transmit a message to a designated entity. For example, if text representing a stock price was designated to be a variable data element, a user may only be interested if the stock price moves above or below a certain threshold; as such, the user may establish a threshold such that a message is only transmitted if the lower or upper threshold is crossed. The message may be transmitted in response to an established threshold being met, such as when the displayed information or any data associated with the variable data element (e.g., metadata) is updated.
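By way of a non-limiting illustration, the stock-price threshold example above may be sketched as follows; the entity name, price values, and thresholds are hypothetical.

```python
from typing import Optional

def maybe_notify(new_price: float, lower: float, upper: float,
                 entity: str) -> Optional[str]:
    """Transmit a message to the designated entity only when the change to
    the variable data element crosses an established threshold."""
    if new_price < lower or new_price > upper:
        return f"message to {entity}: price moved to {new_price}"
    return None  # change stayed within thresholds; no message transmitted

print(maybe_notify(104.0, lower=90.0, upper=110.0, entity="alice"))  # None
print(maybe_notify(112.0, lower=90.0, upper=110.0, entity="alice"))
```

The returned message could be delivered over any of the channels described above (email, SMS, MMS, push notification, or a phone call).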



FIG. 13 illustrates an exemplary interface 1310 allowing the user to designate an entity, via activatable element indicator 1320, to receive a message if any edits are made to the variable data element 1312, consistent with some embodiments of the present disclosure. In FIG. 13, interface 1310 may enable designation of entities to be notified of a change in a variable data element 1312 by interacting with lookup interface 1322 that may enable a user to manually choose a designated entity from a drop-down list such as an employee list or manually enter contact information such as a phone number or email address. While not pictured in the figure, upon the detection of a change to a variable data element, the system may enact logical rules, automation, machine learning, or artificial intelligence to determine an interested party in relation to the change. It should be understood that while the interface 1310 illustrated at FIG. 13 includes the ability to designate entities to be notified upon identification of a change to a variable data element in the same interface 1310 allowing designation of data as a variable data element, these designations do not have to be included in the same rendered interface. It is understood that the transmission of a message and designation of an entity to receive the message may be done in any manner as discussed herein or any manner allowing an entity to be designated to receive a message.


In some embodiments, at least one processor may be configured to display an interface for enabling permissions to be set on a variable data element and to thereby restrict modifications thereto. Displaying an interface for enabling permissions to be set on a variable data element may include rendering a display of information with activatable elements that may enable interaction with the information through a computing device. Permissions to be set on a variable data element may include a parameter that may control the ability of a user, user account, device, system, or combination thereof to access a variable data element, view a variable data element, use a function associated with a variable data element, edit a variable data element, delete a variable data element, move a variable data element, re-size a variable data element, influence a variable data element, or perform any other operation relative to a variable data element. Enabling permissions to be set on a variable data element and to thereby restrict modifications thereto may include limiting the ability of a user, user account, device, system, or combination thereof to make alterations, changes, edits, or any other modification to the data corresponding to a variable data element. This may involve sending instructions to the processor to place a memory lock on the data stored in the repository associated with the variable data element until an entity accessing the data associated with the variable data element is determined by the processor to be an authorized editor. Restricting modifications may include reducing the ability to alter (e.g., may alter a color, but not the text) or completely prohibiting any alterations to a variable data element. Permission settings for a particular variable data element in a document may be independent from the permission settings for other variable data elements located in the same document.
For example, a first variable data element may have restrictive permission settings that enable only the author of the document to edit the first variable data element while a second variable data element may have public permission settings that enable any user to edit the second variable data element. As a result, an author of the document may edit both the first variable data element and the second variable data element while a second user (e.g., not an author of the document) would be prevented from making any edits or alterations to the first variable data element and would only be able to do so for the second variable data element.
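The per-element, independent permission behavior in the example above may be sketched as follows. The `Permission` levels and the `can_edit` check are illustrative assumptions only, not a definitive permission model:

```python
from enum import Enum


class Permission(Enum):
    """Two hypothetical permission settings for a variable data element."""
    AUTHOR_ONLY = "author_only"  # restrictive: only the document author may edit
    PUBLIC = "public"            # permissive: any user may edit


def can_edit(element_permission: Permission, user: str, author: str) -> bool:
    """Return True if the given user may modify a variable data element."""
    if element_permission is Permission.PUBLIC:
        return True
    return user == author  # AUTHOR_ONLY restricts edits to the document author


# Two elements in the same document carry independent permission settings:
first, second = Permission.AUTHOR_ONLY, Permission.PUBLIC
author_ok_first = can_edit(first, "author", "author")    # author may edit both
guest_ok_first = can_edit(first, "guest", "author")      # guest blocked on the first
guest_ok_second = can_edit(second, "guest", "author")    # guest allowed on the second
```

A fuller implementation could map each permission level to a set of allowed operations (view, edit, delete, re-size, and so on) as enumerated above.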



FIG. 13 illustrates an exemplary interface 1310 allowing the user to enable permissions to be set on a variable data element and to thereby restrict modifications thereto, consistent with some embodiments of the present disclosure. In FIG. 13, interface 1310 may enable designation of different levels of access via activatable element indicator 1326 and lookup interface 1328. Lookup interface 1328 may allow the user to access a drop-down menu containing different levels of permission (e.g., “view only” or “redact data”). Further, interface 1310 may enable the user to designate the users to which the various levels of access may apply via activatable element indicator 1330. Lookup interface 1332 may allow the user to manually enter a user's name to correspond to the level of access identified via indicator 1326. Lookup interface 1332 may also enable the user to designate the users to which the level of access applies by allowing the user to select the users from a list, such as an employee list. However, it is understood that an interface for enabling permissions to be set on a variable data element may be displayed in any manner as discussed herein or any manner allowing the user to enable permissions.


Some embodiments may include, upon identification of a change, accessing an external file via a link. Accessing an external file via a link may include retrieving the electronic word processing document from a storage medium, such as a local storage medium or a remote storage medium, following activation of a text hyperlink, image hyperlink, bookmark hyperlink, or any other type of link allowing the system to identify a repository and retrieve the file from a separate storage device or a third party platform independent from the electronic word processing document. In some embodiments, accessing the external file via a link may include retrieving the file from a web browser cache. Additionally or alternatively, accessing the external file may include accessing a live data stream of the external file from a remote source. In some embodiments, accessing the external file may include logging into an account having a permission to access the document. For example, accessing the external file may be achieved by interacting with an indication associated with the external file, such as an icon or file name, which may cause the system to retrieve (e.g., from a storage medium) a particular external file associated with the indication.
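One way to picture resolving a link to a repository and retrieving the external file is sketched below. The `repo://` scheme, the in-memory `repositories` mapping, and the function name are hypothetical stand-ins for whatever storage medium or third-party platform an embodiment actually uses:

```python
def access_external_file(link: str, repositories: dict[str, dict[str, str]]) -> str:
    """Resolve a link of the hypothetical form 'repo://<store>/<file-id>' and
    return the referenced file's content from the identified repository."""
    scheme, _, rest = link.partition("://")
    if scheme != "repo":
        raise ValueError(f"unsupported link scheme: {scheme}")
    store_name, _, file_id = rest.partition("/")
    store = repositories[store_name]  # identify the repository named in the link
    return store[file_id]             # retrieve the file from that repository


# Usage: a local store standing in for a remote storage device or platform.
repos = {"local": {"budget.xlsx": "total=5000"}}
content = access_external_file("repo://local/budget.xlsx", repos)
```

In practice the lookup could instead hit a browser cache, a live data stream, or an authenticated remote account, as described above.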


Some embodiments may include updating an external file to reflect a change to a variable data element in the electronic word processing document. Updating an external file to reflect a change to a variable data element may include syncing, changing, modifying, editing, manipulating, or any other form of altering data associated with the variable data element in the external file in response to a change to a variable data element. The external file reflecting a change to a variable data element may include updating the data in the external file corresponding to the data or information associated with the variable data element in the electronic word processing document to be equivalent to the change to the variable data element, to be similar to the change to the variable data element, to manipulate the data by a similar magnitude or process as the variable data element, or any other edit to reflect the change to the variable data element. For example, the variable data element present in an electronic word processing document may be text-based data identifying the amount of money a company has raised at a fundraiser and may be linked to an external accounting file. If on the final day of the fundraiser, the president of the company receives a donation in person that puts the amount of donations collected over the company's goal, the president may edit the variable data element to reflect the new total and change the font color to green. Following this example, the data of the external accounting file corresponding to the variable data element in the electronic word processing document may be updated to reflect the change (e.g., adding “Goal Reached” to the external file) and thus, represent the new total in a green font or otherwise reflecting an indication of the information reflected in the variable data element.


By way of example, FIG. 16B illustrates an exemplary external file 1610B including text-based data 1612B, 1614B, and 1616B corresponding to variable data elements 1612A, 1614A, and 1616A in an electronic word processing document 1610A of FIG. 16A, consistent with some embodiments of the present disclosure. As shown in FIG. 16B, the external file 1610B has been updated to reflect the changes to variable data elements 1614A and 1616A in FIG. 16A such that the data corresponding to variable data element 1614A has changed from Jan. 3, 2022 to Jan. 4, 2022 and the data corresponding to variable data element 1616A has changed from Jan. 4, 2022 to Jan. 3, 2022.


In some embodiments, at least one processor may be configured to receive a selection of a variable data element and to present, in an iframe, information from an external file. Receiving a selection of a variable data element, as used herein, may include the use of a keyboard or a pointing device (e.g., a mouse or a trackball) by which the user can provide input (e.g., a click, gesture, cursor hover, or any other interaction) to an associated computing device to indicate an intent to elect a particular variable data element that may be displayed on an associated display of the computing device. Other kinds of devices can be used to provide for interaction with a user to facilitate the selection as well; for example, sensory interaction provided by the user can be any form of sensory interaction (e.g., visual interaction, auditory interaction, or tactile interaction).


By way of example, FIG. 17A shows the input of a selection of a variable data element 1712A in an electronic word processing document 1710A which can be carried out using a cursor 1718A associated with a device (e.g., touchpad, touchscreen, mouse, or any other interface device), consistent with some embodiments of the present disclosure.


Presenting, in an iframe, information from an external file may include rendering display of an iframe or a similar window including any data present or otherwise stored in an external file. The information from the external file included in the iframe may include the entirety of the external file, the replacement data in the external file, or any other data present in the external file and selected by the user or system to be included in the iframe. For example, the system may use logical rules, automation, machine learning, or artificial intelligence to determine the information from the external file to include in the iframe based on contextual analysis of the data corresponding to the variable data element. As an additional example, the information in the iframe may include the past values of the replacement data, retrieved from a data structure that stores the value of the replacement data each time the system receives an API call (or other type of software call) indicating that the replacement data has changed or the system detects a change in the replacement data, to show the change over time in the value of the replacement data in the external file. For example, a user may select a variable data element corresponding to the inventory for a particular product via a mouse click and, in response, the system may render a display of an iframe including information related to the inventory of a particular item, retrieved from the external file, such as the price of the item, the next estimated restock date, and the history of sales for that item.
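Assembling an iframe from selected fields of the external file may be sketched as below. The field names, the `srcdoc` embedding approach, and the helper name are illustrative assumptions; an actual embodiment might instead point the iframe at a served URL:

```python
import html


def iframe_for_selection(external_info: dict[str, str],
                         fields: list[str]) -> str:
    """Build an HTML iframe snippet presenting only the fields selected
    (by the user or the system) from the external file's information."""
    rows = "".join(
        f"<p><b>{html.escape(k)}:</b> {html.escape(external_info[k])}</p>"
        for k in fields if k in external_info
    )
    # srcdoc holds the inline document; sandbox restricts what it may do.
    return f'<iframe srcdoc="{html.escape(rows)}" sandbox=""></iframe>'


# Usage: the inventory example above, with only two fields chosen for display.
info = {"price": "$12.99", "next_restock": "Feb 1", "sales_history": "120 units"}
snippet = iframe_for_selection(info, ["price", "next_restock"])
```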


By way of example, in FIG. 17B, in response to the selection of variable data element 1712A with cursor 1718A in FIG. 17A, an iframe 1712B may be presented to display data from the external file 1714B and its associated information (e.g., textual, graphical, or a combination thereof), consistent with some embodiments of the present disclosure. For example, the information 1714B from the external file may include additional information not typically displayed in the electronic word processing document 1710B such as the text-based data representing Randall James' talking points 1716B or metadata that is stored in the electronic word processing document 1710B.



FIG. 18 illustrates a block diagram of an example process 1810 for automatically updating an electronic word processing document based on a change in a linked file and vice versa. While the block diagram may be described below in connection with certain implementation embodiments presented in other figures, those implementations are provided for illustrative purposes only, and are not intended to serve as a limitation on the block diagram. In some embodiments, the process 1810 may be performed by at least one processor (e.g., the processing circuitry 110 in FIG. 1) of a computing device (e.g., the computing device 100 in FIGS. 1 and 2) to perform operations or functions described herein and may be described hereinafter with reference to FIGS. 11 to 17B by way of example. In some embodiments, some aspects of the process 1810 may be implemented as software (e.g., program codes or instructions) that are stored in a memory (e.g., the memory portion 122 in FIG. 1) or a non-transitory computer-readable medium. In some embodiments, some aspects of the process 1810 may be implemented as hardware (e.g., a specific-purpose circuit). In some embodiments, the process 1810 may be implemented as a combination of software and hardware.



FIG. 18 includes process blocks 1812 to 1826. At block 1812, a processing means (e.g., any type of processor described herein or that otherwise performs actions on data) may access an electronic word processing document, consistent with some embodiments of the present disclosure.


At block 1814, the processing means may identify a variable data element. The variable data element may include current data presented in the electronic word processing document and a link to a file external to the electronic word processing document, as discussed above.


At block 1816, the processing means may access an external file identified in the link, as previously discussed in the disclosure above.


At block 1818, the processing means may pull, from the external file, first replacement data corresponding to the current data, as previously discussed above.


At block 1820, the processing means may replace the current data in the electronic word processing document with the first replacement data, as previously discussed above.


At block 1822, the processing means may identify a change to the variable data element present in the electronic word processing document, as previously discussed above.


At block 1824, the processing means may, upon identification of the change, access the external file via the link, as previously discussed above.


At block 1826, the processing means may update the external file to reflect the change to the variable data element, as previously discussed above.


Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art. The materials, methods, and examples provided herein are illustrative only and not intended to be limiting.


Implementation of the method and system of the present disclosure may involve performing or completing certain selected tasks or steps manually, automatically, or a combination thereof. Moreover, according to actual instrumentation and equipment of preferred embodiments of the method and system of the present disclosure, several selected steps may be implemented by hardware (HW) or by software (SW) on any operating system of any firmware, or by a combination thereof. For example, as hardware, selected steps of the disclosure could be implemented as a chip or a circuit. As software or algorithm, selected steps of the disclosure could be implemented as a plurality of software instructions being executed by a computer using any suitable operating system. In any case, selected steps of the method and system of the disclosure could be described as being performed by a data processor, such as a computing device for executing a plurality of instructions.


As used herein, the terms “machine-readable medium” and “computer-readable medium” refer to any computer program product, apparatus and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term “machine-readable signal” refers to any signal used to provide machine instructions and/or data to a programmable processor.


Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, specially designed ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various implementations can include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device.


Although the present disclosure is described with regard to a “computing device”, a “computer”, or “mobile device”, it should be noted that optionally any device featuring a data processor and the ability to execute one or more instructions may be described as a computing device, including but not limited to any type of personal computer (PC), a server, a distributed server, a virtual server, a cloud computing platform, a cellular telephone, an IP telephone, a smartphone, a smart watch or a PDA (personal digital assistant). Any two or more of such devices in communication with each other may optionally comprise a “network” or a “computer network”.


To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having a display device (e.g., a LED (light-emitting diode), or OLED (organic LED), or LCD (liquid crystal display) monitor/screen) for displaying information to the user and a keyboard and a pointing device (e.g., a mouse or a trackball) by which the user can provide input to the computer. Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user can be received in any form, including acoustic, speech, or tactile input.


The systems and techniques described here can be implemented in a computing system that includes a back end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front end component (e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back end, middleware, or front end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network (“LAN”), a wide area network (“WAN”), and the Internet.


The computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.


It should be appreciated that the above described methods and apparatus may be varied in many ways, including omitting or adding steps, changing the order of steps and the type of devices used. It should be appreciated that different features may be combined in different ways. In particular, not all the features shown above in a particular embodiment or implementation are necessary in every embodiment or implementation of the invention. Further combinations of the above features and implementations are also considered to be within the scope of some embodiments or implementations of the invention.


While certain features of the described implementations have been illustrated as described herein, many modifications, substitutions, changes and equivalents will now occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the scope of the implementations. It should be understood that they have been presented by way of example only, not limitation, and various changes in form and details may be made. Any portion of the apparatus and/or methods described herein may be combined in any combination, except mutually exclusive combinations. The implementations described herein can include various combinations and/or sub-combinations of the functions, components and/or features of the different implementations described.


Disclosed embodiments may include any one of the following bullet-pointed features alone or in combination with one or more other bullet-pointed features, whether implemented as a method, by at least one processor, and/or stored as executable instructions on non-transitory computer-readable media:

    • accessing an electronic word processing document;
    • presenting an interface enabling selection of a live application, outside the electronic word processing document, for embedding in the electronic word processing document;
    • embedding, in-line with text of the electronic word processing document, a live active icon representative of the live application;
    • presenting, in a first viewing mode, the live active icon;
    • wherein during the first viewing mode, the live active icon is displayed embedded in-line with the text, and the live active icon dynamically changes based on occurrences outside the electronic word processing document;
    • receiving a selection of the live active icon;
    • in response to the selection, presenting in a second viewing mode, an expanded view of the live application;
    • receiving a collapse instruction;
    • in response to the collapse instruction, reverting from the second viewing mode to the first viewing mode;
    • embedding, in-line with text by sizing the live active icon to correspond to an in-line text font size;
    • wherein in the first viewing mode the live active icon has an appearance corresponding to imagery present in the expanded view;
    • presenting the second viewing mode in an iframe;
    • wherein the interface is configured to enable selection of abridged information for presentation in the first viewing mode;
    • wherein the interface includes a permission tool for enabling selective access restriction to at least one of the live active icon or the expanded view;
    • wherein the live active icon includes an animation that plays in-line with the text during the first viewing mode;
    • accessing the electronic word processing document;
    • identifying in the electronic word processing document a variable data element;
    • wherein the variable data element includes current data presented in the electronic word processing document and a link to a file external to the electronic word processing document;
    • accessing the external file identified in the link;
    • pulling, from the external file, first replacement data corresponding to the current data;
    • replacing the current data in the electronic word processing document with the first replacement data;
    • identifying a change to the variable data element in the electronic word processing document;
    • upon identification of the change, accessing the external file via the link;
    • updating the external file to reflect the change to the variable data element in the electronic word processing document;
    • wherein the current data includes text of the electronic word processing document and the link includes metadata associated with the text;
    • presenting an interface in the electronic word processing document for enabling designation of document text as the variable data element and for enabling designation of a file as a source of the replacement data;
    • displaying an interface for enabling permissions to be set on the variable data element and to thereby restrict modifications thereto;
    • wherein the external file is an additional electronic word processing document;
    • transmitting a message to a designated entity when the variable data element is changed; and
    • receiving a selection of the variable data element and to present in an iframe information from the external file.


Systems and methods disclosed herein involve unconventional improvements over conventional approaches. Descriptions of the disclosed embodiments are not exhaustive and are not limited to the precise forms or embodiments disclosed. Modifications and adaptations of the embodiments will be apparent from consideration of the specification and practice of the disclosed embodiments. Additionally, the disclosed embodiments are not limited to the examples discussed herein.


The foregoing description has been presented for purposes of illustration. It is not exhaustive and is not limited to the precise forms or embodiments disclosed. Modifications and adaptations of the embodiments will be apparent from consideration of the specification and practice of the disclosed embodiments. For example, the described implementations include hardware and software, but systems and methods consistent with the present disclosure may be implemented as hardware alone.


It is appreciated that the above described embodiments can be implemented by hardware, or software (program codes), or a combination of hardware and software. If implemented by software, it can be stored in the above-described computer-readable media. The software, when executed by the processor can perform the disclosed methods. The computing units and other functional units described in the present disclosure can be implemented by hardware, or software, or a combination of hardware and software. One of ordinary skill in the art will also understand that multiple ones of the above described modules/units can be combined as one module or unit, and each of the above described modules/units can be further divided into a plurality of sub-modules or sub-units.


The block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer hardware or software products according to various example embodiments of the present disclosure. In this regard, each block in a flowchart or block diagram may represent a module, segment, or portion of code, which includes one or more executable instructions for implementing the specified logical functions. It should be understood that in some alternative implementations, functions indicated in a block may occur out of the order noted in the figures. For example, two blocks shown in succession may be executed or implemented substantially concurrently, or two blocks may sometimes be executed in reverse order, depending upon the functionality involved. Some blocks may also be omitted. It should also be understood that each block of the block diagrams, and combination of the blocks, may be implemented by special purpose hardware-based systems that perform the specified functions or acts, or by combinations of special purpose hardware and computer instructions.


In the foregoing specification, embodiments have been described with reference to numerous specific details that can vary from implementation to implementation. Certain adaptations and modifications of the described embodiments can be made. Other embodiments can be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the invention being indicated by the following claims. It is also intended that the sequence of steps shown in the figures is only for illustrative purposes and is not intended to be limited to any particular sequence of steps. As such, those skilled in the art can appreciate that these steps can be performed in a different order while implementing the same method.


It will be appreciated that the embodiments of the present disclosure are not limited to the exact construction that has been described above and illustrated in the accompanying drawings, and that various modifications and changes may be made without departing from the scope thereof.


Other embodiments will be apparent to those skilled in the art from consideration of the specification and practice of the embodiments disclosed herein. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosed embodiments being indicated by the following claims.


Computer programs based on the written description and methods of this specification are within the skill of a software developer. The various programs or program modules can be created using a variety of programming techniques. One or more of such software sections or modules can be integrated into a computer system, non-transitory computer readable media, or existing software.


Moreover, while illustrative embodiments have been described herein, the scope includes any and all embodiments having equivalent elements, modifications, omissions, combinations (e.g., of aspects across various embodiments), adaptations or alterations based on the present disclosure. The elements in the claims are to be interpreted broadly based on the language employed in the claims and not limited to examples described in the present specification or during the prosecution of the application. These examples are to be construed as non-exclusive. Further, the steps of the disclosed methods can be modified in any manner, including by reordering steps or inserting or deleting steps. It is intended, therefore, that the specification and examples be considered as exemplary only, with a true scope and spirit being indicated by the following claims and their full scope of equivalents.

Claims
  • 1. A system for causing dynamic activity in an electronic word processing document, the system comprising: at least one processor configured to:access the electronic word processing document;present an interface enabling selection of a live application, outside the electronic word processing document, for embedding in the electronic word processing document;embed, in-line with text of the electronic word processing document, a live active icon representative of the live application;present, in a first viewing mode, the live active icon wherein during the first viewing mode, the live active icon is displayed embedded in-line with the text by sizing the live active icon to correspond to an in-line text font size based on a logical rule, and the live active icon dynamically changes based on occurrences outside the electronic word processing document, wherein dynamically changing includes re-rendering the live active icon, wherein an occurrence includes a change in value in the live application;receive a selection of the live active icon;in response to the selection, present in a second viewing mode, an expanded view of the live application;receive a collapse instruction; andin response to the collapse instruction, revert from the second viewing mode to the first viewing mode.
  • 2. The system of claim 1, wherein in the first viewing mode the live active icon has an appearance corresponding to imagery present in the expanded view.
  • 3. The system of claim 1, wherein the at least one processor is further configured to present the second viewing mode in an iframe.
  • 4. The system of claim 1, wherein the interface is configured to enable selection of abridged information to be represented by the live active icon for presentation in the first viewing mode.
  • 5. The system of claim 1, wherein the interface includes a permission tool for enabling selective access restriction to at least one of the live active icon or the expanded view.
  • 6. The system of claim 1, wherein the live active icon includes an animation that plays in-line with the text during the first viewing mode.
  • 7. The system of claim 1, wherein the change in value in the live application comprises a change in value of data from the live application, andthe data from the live application to be represented by the live active icon is based on contextual detection.
  • 8. The system of claim 1, wherein dynamically changing the live active icon includes accessing a data structure to identify an icon manipulation corresponding to the occurrence.
  • 9. The system of claim 1, wherein the live active icon includes a symbol, emblem, sign, or mark.
  • 10. The system of claim 1, wherein the electronic word processing document includes a table.
  • 11. The system of claim 1, wherein the electronic word processing document includes a database.
  • 12. A non-transitory computer readable medium containing instructions that when executed by at least one processor cause the at least one processor to perform operations for causing dynamic activity in an electronic word processing document, the operations comprising: accessing the electronic word processing document;presenting an interface enabling selection of a live application, outside the electronic word processing document, for embedding in the electronic word processing document;embedding, in-line with text of the electronic word processing document, a live active icon representative of the live application;presenting, in a first viewing mode, the live active icon wherein during the first viewing mode, the live active icon is displayed embedded in-line with the text including sizing the live active icon to correspond to an in-line text font size based on a logical rule, and the live active icon dynamically changes based on occurrences outside the electronic word processing document, wherein dynamically changing includes re-rendering the live active icon, wherein an occurrence includes a change in value in the live application;receiving a selection of the live active icon;in response to the selection, presenting in a second viewing mode, an expanded view of the live application;receiving a collapse instruction; andin response to the collapse instruction, reverting from the second viewing mode to the first viewing mode.
  • 13. The non-transitory computer readable medium of claim 12, wherein in the first viewing mode the live active icon has an appearance corresponding to imagery present in the expanded view.
  • 14. The non-transitory computer readable medium of claim 12, wherein the operations further comprise presenting the second viewing mode in an iframe.
  • 15. The non-transitory computer readable medium of claim 12, wherein the interface is configured to enable selection of abridged information to be represented by the live active icon for presentation in the first viewing mode.
  • 16. The non-transitory computer readable medium of claim 12, wherein the interface includes a permission tool for enabling selective access restriction to at least one of the live active icon or the expanded view.
  • 17. The non-transitory computer readable medium of claim 12, wherein the live active icon includes an animation that plays in-line with the text during the first viewing mode.
  • 18. A method for causing dynamic activity in an electronic word processing document, the method comprising: accessing the electronic word processing document; presenting an interface enabling selection of a live application, outside the electronic word processing document, for embedding in the electronic word processing document; embedding, in-line with text of the electronic word processing document, a live active icon representative of the live application; presenting, in a first viewing mode, the live active icon, wherein during the first viewing mode, the live active icon is displayed embedded in-line with the text including sizing the live active icon to correspond to an in-line text font size based on a logical rule, and the live active icon dynamically changes based on occurrences outside the electronic word processing document, wherein dynamically changing includes re-rendering the live active icon, wherein an occurrence includes a change in value in the live application; receiving a selection of the live active icon; in response to the selection, presenting, in a second viewing mode, an expanded view of the live application; receiving a collapse instruction; and in response to the collapse instruction, reverting from the second viewing mode to the first viewing mode.
  • 19. The method of claim 18, wherein in the first viewing mode the live active icon has an appearance corresponding to imagery present in the expanded view.
  • 20. The method of claim 18, wherein the interface is configured to enable selection of abridged information to be represented by the live active icon for presentation in the first viewing mode.
  • 21. The method of claim 18, wherein the interface includes a permission tool for enabling selective access restriction to at least one of the live active icon or the expanded view.
  • 22. The method of claim 18, wherein the live active icon includes an animation that plays in-line with the text during the first viewing mode.
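The claimed flow — embedding a live active icon in-line, sizing it to the surrounding text font, re-rendering it on an occurrence (a value change in the live application), and toggling between the first and second viewing modes on selection and collapse — can be illustrated with a minimal sketch. This is not the patented implementation; all type names, the 1:1 sizing rule, and the field layout below are assumptions chosen for demonstration only.

```typescript
type ViewingMode = "first" | "second";

interface LiveActiveIcon {
  mode: ViewingMode;     // first = in-line icon, second = expanded view
  fontSizePx: number;    // in-line text font size at the embed location
  iconSizePx: number;    // rendered icon size, derived from a logical rule
  lastValue: unknown;    // last value observed in the live application
}

// One possible "logical rule" for sizing: match the icon 1:1 to the
// surrounding in-line font size (an assumed rule, for illustration).
function sizeIconToFont(fontSizePx: number): number {
  return Math.round(fontSizePx);
}

// Embed the icon in the first viewing mode, sized to the in-line text.
function embedIcon(fontSizePx: number, initialValue: unknown): LiveActiveIcon {
  return {
    mode: "first",
    fontSizePx,
    iconSizePx: sizeIconToFont(fontSizePx),
    lastValue: initialValue,
  };
}

// An occurrence outside the document (a change in value in the live
// application) triggers a re-render; returns true when a re-render is needed.
function onOccurrence(icon: LiveActiveIcon, newValue: unknown): boolean {
  if (newValue === icon.lastValue) return false;
  icon.lastValue = newValue;
  return true;
}

// Selecting the icon presents the expanded second viewing mode;
// a collapse instruction reverts to the first viewing mode.
function select(icon: LiveActiveIcon): void { icon.mode = "second"; }
function collapse(icon: LiveActiveIcon): void { icon.mode = "first"; }
```

In a browser setting, the second viewing mode could plausibly be hosted in an iframe (as claim 14 contemplates), with the re-render step swapping the icon's rendered imagery to mirror the expanded view.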
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is based on and claims benefit of priority of International Patent Application No. PCT/IB2021/062440 filed on Dec. 29, 2021, which claims priority to U.S. Provisional Patent Application No. 63/233,925, filed Aug. 17, 2021, U.S. Provisional Patent Application No. 63/273,448, filed Oct. 29, 2021, U.S. Provisional Patent Application No. 63/273,453, filed Oct. 29, 2021, International Patent Application No. PCT/IB2021/000024, filed on Jan. 14, 2021, International Patent Application No. PCT/IB2021/000090, filed on Feb. 11, 2021, and International Patent Application No. PCT/IB2021/000297, filed on Apr. 28, 2021, the contents of all of which are incorporated herein by reference in their entireties.

US Referenced Citations (887)
Number Name Date Kind
4972314 Getzinger et al. Nov 1990 A
5220657 Bly et al. Jun 1993 A
5479602 Baecker et al. Dec 1995 A
5517663 Kahn May 1996 A
5632009 Rao et al. May 1997 A
5682469 Linnett Oct 1997 A
5696702 Skinner et al. Dec 1997 A
5726701 Needham Mar 1998 A
5787411 Groff et al. Jul 1998 A
5880742 Rao et al. Mar 1999 A
5933145 Meek Aug 1999 A
6016438 Wakayama Jan 2000 A
6016553 Schneider et al. Jan 2000 A
6023695 Osborn et al. Feb 2000 A
6034681 Miller Mar 2000 A
6049622 Robb et al. Apr 2000 A
6088707 Bates et al. Jul 2000 A
6108573 Debbins et al. Aug 2000 A
6111573 McComb et al. Aug 2000 A
6157381 Bates et al. Dec 2000 A
6167405 Rosensteel, Jr. et al. Dec 2000 A
6169534 Raffel et al. Jan 2001 B1
6182127 Cronin, III et al. Jan 2001 B1
6185582 Zellweger et al. Feb 2001 B1
6195794 Buxton Feb 2001 B1
6252594 Xia et al. Jun 2001 B1
6266067 Owen et al. Jul 2001 B1
6275809 Tamaki et al. Aug 2001 B1
6330022 Seligmann Dec 2001 B1
6377965 Hachamovitch et al. Apr 2002 B1
6385617 Malik May 2002 B1
6460043 Tabbara et al. Oct 2002 B1
6496832 Chi et al. Dec 2002 B2
6509912 Moran et al. Jan 2003 B1
6510459 Cronin, III et al. Jan 2003 B2
6522347 Tsuji et al. Feb 2003 B1
6527556 Koskinen Mar 2003 B1
6567830 Madduri May 2003 B1
6606740 Lynn et al. Aug 2003 B1
6636242 Bowman-Amuah Oct 2003 B2
6647370 Fu et al. Nov 2003 B1
6661431 Stuart et al. Dec 2003 B1
6988248 Tang Jan 2006 B1
7027997 Robinson et al. Apr 2006 B1
7034860 Lia et al. Apr 2006 B2
7043529 Simonoff May 2006 B1
7054891 Cole May 2006 B2
7228492 Graham Jun 2007 B1
7237188 Leung Jun 2007 B1
7249042 Doerr et al. Jul 2007 B1
7272637 Himmelstein Sep 2007 B1
7274375 David Sep 2007 B1
7379934 Forman et al. May 2008 B1
7383320 Silberstein et al. Jun 2008 B1
7389473 Sawicki et al. Jun 2008 B1
7415664 Aureglia et al. Aug 2008 B2
7417644 Cooper et al. Aug 2008 B2
7461077 Greenwood Dec 2008 B1
7489976 Adra Feb 2009 B2
7617443 Mills et al. Nov 2009 B2
7685152 Chivukula et al. Mar 2010 B2
7707514 Forstall et al. Apr 2010 B2
7710290 Johnson May 2010 B2
7770100 Chamberlain et al. Aug 2010 B2
7827476 Roberts et al. Nov 2010 B1
7827615 Allababidi et al. Nov 2010 B1
7836408 Ollmann et al. Nov 2010 B1
7916157 Kelley et al. Mar 2011 B1
7921360 Sundermeyer et al. Apr 2011 B1
7933952 Parker et al. Apr 2011 B2
7954043 Bera May 2011 B2
7954064 Forstall et al. May 2011 B2
8046703 Busch et al. Oct 2011 B2
8078955 Gupta Dec 2011 B1
8082274 Steinglass et al. Dec 2011 B2
8108241 Shukoor Jan 2012 B2
8136031 Massand Mar 2012 B2
8151213 Weitzman et al. Apr 2012 B2
8223172 Miller et al. Jul 2012 B1
8286072 Chamberlain et al. Oct 2012 B2
8365095 Bansal et al. Jan 2013 B2
8375327 Lorch et al. Feb 2013 B2
8386960 Eismann et al. Feb 2013 B1
8407217 Zhang Mar 2013 B1
8413261 Nemoy et al. Apr 2013 B2
8423909 Zabielski Apr 2013 B2
8543566 Weissman et al. Sep 2013 B2
8548997 Wu Oct 2013 B1
8560942 Fortes et al. Oct 2013 B2
8566732 Louch et al. Oct 2013 B2
8572173 Briere et al. Oct 2013 B2
8578399 Khen et al. Nov 2013 B2
8601383 Folting et al. Dec 2013 B2
8620703 Kapoor et al. Dec 2013 B1
8621652 Slater, Jr. Dec 2013 B2
8635520 Christiansen et al. Jan 2014 B2
8677448 Kauffman et al. Mar 2014 B1
8738414 Nagar et al. May 2014 B1
8812471 Akita Aug 2014 B2
8819042 Samudrala et al. Aug 2014 B2
8825758 Bailor et al. Sep 2014 B2
8838533 Kwiatkowski et al. Sep 2014 B2
8862979 Hawking Oct 2014 B2
8863022 Rhodes et al. Oct 2014 B2
8869027 Louch et al. Oct 2014 B2
8937627 Otero et al. Jan 2015 B1
8938465 Messer Jan 2015 B2
8954871 Louch et al. Feb 2015 B2
9007405 Eldar et al. Apr 2015 B1
9015716 Fletcher et al. Apr 2015 B2
9026897 Zarras May 2015 B2
9043362 Weissman et al. May 2015 B2
9063958 Müller et al. Jun 2015 B2
9129234 Campbell et al. Sep 2015 B2
9159246 Rodriguez et al. Oct 2015 B2
9172738 daCosta Oct 2015 B1
9177238 Windmueller et al. Nov 2015 B2
9183303 Goel et al. Nov 2015 B1
9223770 Ledet Dec 2015 B1
9239719 Feinstein et al. Jan 2016 B1
9244917 Sharma et al. Jan 2016 B1
9253130 Zaveri Feb 2016 B2
9286246 Saito et al. Mar 2016 B2
9286475 Li et al. Mar 2016 B2
9292587 Kann et al. Mar 2016 B2
9336502 Mohammad et al. May 2016 B2
9342579 Cao et al. May 2016 B2
9361287 Simon et al. Jun 2016 B1
9390059 Gur et al. Jul 2016 B1
9424287 Schroth Aug 2016 B2
9424333 Bisignani et al. Aug 2016 B1
9424545 Lee Aug 2016 B1
9430458 Rhee et al. Aug 2016 B2
9449031 Barrus et al. Sep 2016 B2
9495386 Tapley et al. Nov 2016 B2
9519699 Kulkarni et al. Dec 2016 B1
9558172 Rampson et al. Jan 2017 B2
9613086 Sherman Apr 2017 B1
9635091 Laukkanen et al. Apr 2017 B1
9659284 Wilson et al. May 2017 B1
9679456 East Jun 2017 B2
9720602 Chen et al. Aug 2017 B1
9727376 Bills et al. Aug 2017 B1
9760271 Persaud Sep 2017 B2
9794256 Kiang et al. Oct 2017 B2
9798829 Baisley Oct 2017 B1
9811676 Gauvin Nov 2017 B1
9866561 Psenka et al. Jan 2018 B2
9870136 Pourshahid Jan 2018 B2
10043296 Li Aug 2018 B2
10067928 Krappe Sep 2018 B1
10078668 Woodrow et al. Sep 2018 B1
10169306 O'Shaughnessy et al. Jan 2019 B2
10176154 Ben-Aharon et al. Jan 2019 B2
10235441 Makhlin et al. Mar 2019 B1
10255609 Kinkead et al. Apr 2019 B2
10282405 Silk et al. May 2019 B1
10282406 Bissantz May 2019 B2
10311080 Folting et al. Jun 2019 B2
10318624 Rosner et al. Jun 2019 B1
10327712 Beymer et al. Jun 2019 B2
10347017 Ruble et al. Jul 2019 B2
10372706 Chavan et al. Aug 2019 B2
10380140 Sherman Aug 2019 B2
10423758 Kido et al. Sep 2019 B2
10445702 Hunt Oct 2019 B1
10452360 Burman et al. Oct 2019 B1
10453118 Smith et al. Oct 2019 B2
10474317 Ramanathan et al. Nov 2019 B2
10489391 Tomlin Nov 2019 B1
10489462 Rogynskyy et al. Nov 2019 B1
10496737 Sayre et al. Dec 2019 B1
10505825 Bettaiah et al. Dec 2019 B1
10528599 Pandis et al. Jan 2020 B1
10534507 Laukkanen et al. Jan 2020 B1
10540152 Krishnaswamy et al. Jan 2020 B1
10540434 Migeon et al. Jan 2020 B2
10546001 Nguyen et al. Jan 2020 B1
10564622 Dean et al. Feb 2020 B1
10573407 Ginsburg Feb 2020 B2
10579724 Campbell et al. Mar 2020 B2
10587714 Kulkarni et al. Mar 2020 B1
10628002 Kang et al. Apr 2020 B1
10698594 Sanches et al. Jun 2020 B2
10706061 Sherman et al. Jul 2020 B2
10719220 Ouellet et al. Jul 2020 B2
10733256 Fickenscher et al. Aug 2020 B2
10740117 Ording et al. Aug 2020 B2
10747764 Plenderleith Aug 2020 B1
10747950 Dang et al. Aug 2020 B2
10748312 Ruble et al. Aug 2020 B2
10754688 Powell Aug 2020 B2
10761691 Anzures et al. Sep 2020 B2
10795555 Burke et al. Oct 2020 B2
10809696 Principato Oct 2020 B1
10817660 Rampson et al. Oct 2020 B2
D910077 Naroshevitch et al. Feb 2021 S
10963578 More et al. Mar 2021 B2
11010371 Slomka et al. May 2021 B1
11030259 Mullins et al. Jun 2021 B2
11042363 Krishnaswamy et al. Jun 2021 B1
11042699 Sayre et al. Jun 2021 B1
11048714 Sherman et al. Jun 2021 B2
11086894 Srivastava et al. Aug 2021 B1
11144854 Mouawad Oct 2021 B1
11222167 Gehrmann et al. Jan 2022 B2
11243688 Remy et al. Feb 2022 B1
11429384 Navert et al. Aug 2022 B1
11443390 Caligaris et al. Sep 2022 B1
11620615 Jiang et al. Apr 2023 B2
11682091 Sukman et al. Jun 2023 B2
20010008998 Tamaki et al. Jul 2001 A1
20010032248 Krafchin Oct 2001 A1
20010039551 Saito et al. Nov 2001 A1
20020002459 Lewis et al. Jan 2002 A1
20020065848 Walker et al. May 2002 A1
20020065849 Ferguson et al. May 2002 A1
20020065880 Hasegawa et al. May 2002 A1
20020069207 Alexander et al. Jun 2002 A1
20020075309 Michelman et al. Jun 2002 A1
20020082892 Raffel et al. Jun 2002 A1
20020099777 Gupta et al. Jul 2002 A1
20020138528 Gong et al. Sep 2002 A1
20030033196 Tomlin Feb 2003 A1
20030041113 Larsen Feb 2003 A1
20030051377 Chirafesi, Jr. Mar 2003 A1
20030058277 Bowman-Amuah Mar 2003 A1
20030065662 Cosic Apr 2003 A1
20030093408 Brown et al. May 2003 A1
20030101416 McInnes et al. May 2003 A1
20030135558 Bellotti et al. Jul 2003 A1
20030137536 Hugh Jul 2003 A1
20030187864 McGoveran Oct 2003 A1
20030200215 Chen et al. Oct 2003 A1
20030204490 Kasriel Oct 2003 A1
20030233224 Marchisio et al. Dec 2003 A1
20040032432 Baynger Feb 2004 A1
20040098284 Petito et al. May 2004 A1
20040133441 Brady et al. Jul 2004 A1
20040138939 Theiler Jul 2004 A1
20040139400 Allam Jul 2004 A1
20040162833 Jones et al. Aug 2004 A1
20040172592 Collie et al. Sep 2004 A1
20040212615 Uthe Oct 2004 A1
20040215443 Hatton Oct 2004 A1
20040230940 Cooper et al. Nov 2004 A1
20040268227 Brid Dec 2004 A1
20050034058 Mills et al. Feb 2005 A1
20050034064 Meyers et al. Feb 2005 A1
20050039001 Hudis et al. Feb 2005 A1
20050039033 Meyers et al. Feb 2005 A1
20050044486 Kotler et al. Feb 2005 A1
20050063615 Siegel et al. Mar 2005 A1
20050066306 Diab Mar 2005 A1
20050086360 Mamou et al. Apr 2005 A1
20050091314 Blagsvedt et al. Apr 2005 A1
20050091596 Anthony et al. Apr 2005 A1
20050096973 Heyse et al. May 2005 A1
20050114305 Haynes et al. May 2005 A1
20050125395 Boettiger Jun 2005 A1
20050165600 Kasravi et al. Jul 2005 A1
20050171881 Ghassemieh et al. Aug 2005 A1
20050216830 Turner et al. Sep 2005 A1
20050228250 Bitter et al. Oct 2005 A1
20050251021 Kaufman et al. Nov 2005 A1
20050257204 Bryant et al. Nov 2005 A1
20050278297 Nelson Dec 2005 A1
20050289170 Brown et al. Dec 2005 A1
20050289342 Needham et al. Dec 2005 A1
20050289453 Segal et al. Dec 2005 A1
20060009960 Valencot et al. Jan 2006 A1
20060013462 Sadikali Jan 2006 A1
20060015499 Clissold et al. Jan 2006 A1
20060015806 Wallace Jan 2006 A1
20060031148 O'Dell et al. Feb 2006 A1
20060031764 Keyser et al. Feb 2006 A1
20060036568 Moore Feb 2006 A1
20060047811 Lau et al. Mar 2006 A1
20060053096 Subramanian et al. Mar 2006 A1
20060053194 Schneider et al. Mar 2006 A1
20060069604 Leukart et al. Mar 2006 A1
20060069635 Ram et al. Mar 2006 A1
20060080594 Chavoustie et al. Apr 2006 A1
20060090169 Daniels et al. Apr 2006 A1
20060106642 Reicher et al. May 2006 A1
20060107196 Thanu et al. May 2006 A1
20060111953 Setya May 2006 A1
20060129415 Thukral et al. Jun 2006 A1
20060136828 Asano Jun 2006 A1
20060150090 Swamidass Jul 2006 A1
20060173908 Browning et al. Aug 2006 A1
20060190313 Lu Aug 2006 A1
20060212299 Law Sep 2006 A1
20060224542 Yalamanchi Oct 2006 A1
20060224568 Debrito Oct 2006 A1
20060224946 Barrett et al. Oct 2006 A1
20060236246 Bono et al. Oct 2006 A1
20060250369 Keim Nov 2006 A1
20060253205 Gardiner Nov 2006 A1
20060271574 Villaron et al. Nov 2006 A1
20060287998 Folting et al. Dec 2006 A1
20060294451 Kelkar et al. Dec 2006 A1
20070027932 Thibeault Feb 2007 A1
20070033531 Marsh Feb 2007 A1
20070050322 Vigesaa et al. Mar 2007 A1
20070050379 Day et al. Mar 2007 A1
20070073899 Judge et al. Mar 2007 A1
20070092048 Chelstrom et al. Apr 2007 A1
20070094607 Morgan et al. Apr 2007 A1
20070101291 Forstall et al. May 2007 A1
20070106754 Moore May 2007 A1
20070118527 Winje et al. May 2007 A1
20070118813 Forstall et al. May 2007 A1
20070143169 Grant et al. Jun 2007 A1
20070168861 Bell et al. Jul 2007 A1
20070174228 Folting et al. Jul 2007 A1
20070174760 Chamberlain et al. Jul 2007 A1
20070186173 Both et al. Aug 2007 A1
20070220119 Himmelstein Sep 2007 A1
20070233647 Rawat et al. Oct 2007 A1
20070239746 Masselle et al. Oct 2007 A1
20070256043 Peters et al. Nov 2007 A1
20070282522 Geelen Dec 2007 A1
20070282627 Greenstein et al. Dec 2007 A1
20070283259 Barry et al. Dec 2007 A1
20070294235 Millett Dec 2007 A1
20070299795 Macbeth et al. Dec 2007 A1
20070300174 Macbeth et al. Dec 2007 A1
20070300185 Macbeth et al. Dec 2007 A1
20080004929 Raffel et al. Jan 2008 A9
20080005235 Hegde et al. Jan 2008 A1
20080033777 Shukoor Feb 2008 A1
20080034307 Cisler et al. Feb 2008 A1
20080034314 Louch et al. Feb 2008 A1
20080052291 Bender Feb 2008 A1
20080059312 Gern et al. Mar 2008 A1
20080059539 Chin et al. Mar 2008 A1
20080065460 Raynor Mar 2008 A1
20080077530 Banas et al. Mar 2008 A1
20080097748 Haley et al. Apr 2008 A1
20080104091 Chin May 2008 A1
20080126389 Mush et al. May 2008 A1
20080133736 Wensley et al. Jun 2008 A1
20080148140 Nakano Jun 2008 A1
20080155547 Weber et al. Jun 2008 A1
20080163075 Beck et al. Jul 2008 A1
20080183593 Dierks Jul 2008 A1
20080195948 Bauer Aug 2008 A1
20080209318 Allsop et al. Aug 2008 A1
20080216022 Lorch et al. Sep 2008 A1
20080222192 Hughes Sep 2008 A1
20080256014 Gould et al. Oct 2008 A1
20080256429 Penner et al. Oct 2008 A1
20080270597 Tenenti Oct 2008 A1
20080282189 Hofmann et al. Nov 2008 A1
20080295038 Helfman et al. Nov 2008 A1
20080301237 Parsons Dec 2008 A1
20090006171 Blatchley et al. Jan 2009 A1
20090006283 Labrie et al. Jan 2009 A1
20090013244 Cudich et al. Jan 2009 A1
20090019383 Riley et al. Jan 2009 A1
20090024944 Louch et al. Jan 2009 A1
20090043814 Faris et al. Feb 2009 A1
20090044090 Gur et al. Feb 2009 A1
20090048896 Anandan Feb 2009 A1
20090049372 Goldberg Feb 2009 A1
20090075694 Kim et al. Mar 2009 A1
20090077164 Phillips et al. Mar 2009 A1
20090077217 McFarland et al. Mar 2009 A1
20090083140 Phan Mar 2009 A1
20090094514 Dargahi et al. Apr 2009 A1
20090113310 Appleyard et al. Apr 2009 A1
20090129596 Chavez et al. May 2009 A1
20090132331 Cartledge et al. May 2009 A1
20090132470 Vignet May 2009 A1
20090150813 Chang et al. Jun 2009 A1
20090174680 Anzures et al. Jul 2009 A1
20090192787 Roon Jul 2009 A1
20090198715 Barbarek Aug 2009 A1
20090222760 Halverson et al. Sep 2009 A1
20090248710 McCormack et al. Oct 2009 A1
20090256972 Ramaswamy et al. Oct 2009 A1
20090271696 Bailor et al. Oct 2009 A1
20090276692 Rosner Nov 2009 A1
20090292690 Culbert Nov 2009 A1
20090313201 Huelsman et al. Dec 2009 A1
20090313537 Fu et al. Dec 2009 A1
20090313570 Po et al. Dec 2009 A1
20090319623 Srinivasan et al. Dec 2009 A1
20090319882 Morrison et al. Dec 2009 A1
20090327240 Meehan et al. Dec 2009 A1
20090327301 Lees et al. Dec 2009 A1
20090327851 Raposo Dec 2009 A1
20090327875 Kinkoh Dec 2009 A1
20100017699 Farrell et al. Jan 2010 A1
20100031135 Naghshin et al. Feb 2010 A1
20100070845 Facemire et al. Mar 2010 A1
20100070895 Messer Mar 2010 A1
20100083164 Martin et al. Apr 2010 A1
20100088636 Yerkes et al. Apr 2010 A1
20100095219 Stachowiak et al. Apr 2010 A1
20100095298 Seshadrinathan et al. Apr 2010 A1
20100100427 McKeown et al. Apr 2010 A1
20100100463 Molotsi et al. Apr 2010 A1
20100114926 Agrawal et al. May 2010 A1
20100149005 Yoon et al. Jun 2010 A1
20100174678 Massand Jul 2010 A1
20100205521 Folting Aug 2010 A1
20100228752 Folting et al. Sep 2010 A1
20100241477 Nylander et al. Sep 2010 A1
20100241948 Andeen et al. Sep 2010 A1
20100241968 Tarara et al. Sep 2010 A1
20100241972 Spataro et al. Sep 2010 A1
20100241990 Gabriel et al. Sep 2010 A1
20100251090 Chamberlain et al. Sep 2010 A1
20100251386 Gilzean et al. Sep 2010 A1
20100257015 Molander Oct 2010 A1
20100262625 Pittenger Oct 2010 A1
20100287163 Sridhar et al. Nov 2010 A1
20100287221 Battepati et al. Nov 2010 A1
20100313119 Baldwin et al. Dec 2010 A1
20100324964 Callanan et al. Dec 2010 A1
20100332973 Kloiber et al. Dec 2010 A1
20110010340 Hung et al. Jan 2011 A1
20110016432 Helfman Jan 2011 A1
20110028138 Davies-Moore et al. Feb 2011 A1
20110047484 Mount et al. Feb 2011 A1
20110055177 Chakra et al. Mar 2011 A1
20110066933 Ludwig Mar 2011 A1
20110071869 O'Brien et al. Mar 2011 A1
20110106636 Spear et al. May 2011 A1
20110119352 Perov et al. May 2011 A1
20110154192 Yang et al. Jun 2011 A1
20110179371 Kopycinski et al. Jul 2011 A1
20110205231 Hartley et al. Aug 2011 A1
20110208324 Fukatsu Aug 2011 A1
20110208732 Melton et al. Aug 2011 A1
20110209150 Hammond et al. Aug 2011 A1
20110219321 Gonzalez et al. Sep 2011 A1
20110225525 Chasman et al. Sep 2011 A1
20110231273 Buchheit Sep 2011 A1
20110238716 Amir et al. Sep 2011 A1
20110258040 Ghanasambandam Oct 2011 A1
20110288900 McQueen et al. Nov 2011 A1
20110289397 Eastmond et al. Nov 2011 A1
20110289439 Jugel Nov 2011 A1
20110298618 Stahl et al. Dec 2011 A1
20110302003 Shirish et al. Dec 2011 A1
20120029962 Podgurny et al. Feb 2012 A1
20120035974 Seybold Feb 2012 A1
20120036423 Haynes et al. Feb 2012 A1
20120036462 Schwartz et al. Feb 2012 A1
20120050802 Masuda Mar 2012 A1
20120066587 Zhou et al. Mar 2012 A1
20120072821 Bowling Mar 2012 A1
20120079408 Rohwer Mar 2012 A1
20120081762 Yamada Apr 2012 A1
20120084798 Reeves et al. Apr 2012 A1
20120086716 Reeves et al. Apr 2012 A1
20120086717 Liu Apr 2012 A1
20120089610 Agrawal et al. Apr 2012 A1
20120089914 Holt et al. Apr 2012 A1
20120089992 Reeves et al. Apr 2012 A1
20120096389 Flam et al. Apr 2012 A1
20120096392 Ording et al. Apr 2012 A1
20120102432 Breedvelt-Schouten et al. Apr 2012 A1
20120102543 Kohli et al. Apr 2012 A1
20120110515 Abramoff et al. May 2012 A1
20120116834 Pope et al. May 2012 A1
20120116835 Pope et al. May 2012 A1
20120124749 Lewman May 2012 A1
20120130907 Thompson et al. May 2012 A1
20120131445 Oyarzabal et al. May 2012 A1
20120151173 Shirley et al. Jun 2012 A1
20120158744 Tseng et al. Jun 2012 A1
20120192050 Campbell et al. Jul 2012 A1
20120198322 Gulwani et al. Aug 2012 A1
20120210252 Fedoseyeva et al. Aug 2012 A1
20120215574 Driessnack et al. Aug 2012 A1
20120215578 Swierz, III et al. Aug 2012 A1
20120229867 Takagi Sep 2012 A1
20120233150 Naim et al. Sep 2012 A1
20120233533 Yücel et al. Sep 2012 A1
20120234907 Clark et al. Sep 2012 A1
20120236368 Uchida et al. Sep 2012 A1
20120244891 Appleton Sep 2012 A1
20120246170 Iantorno Sep 2012 A1
20120254252 Jin et al. Oct 2012 A1
20120254770 Ophir Oct 2012 A1
20120260190 Berger et al. Oct 2012 A1
20120278117 Nguyen et al. Nov 2012 A1
20120284197 Strick et al. Nov 2012 A1
20120297307 Rider et al. Nov 2012 A1
20120300931 Ollikainen et al. Nov 2012 A1
20120303262 Alam et al. Nov 2012 A1
20120304098 Kuulusa Nov 2012 A1
20120311496 Cao et al. Dec 2012 A1
20120311672 Connor et al. Dec 2012 A1
20120324348 Rounthwaite Dec 2012 A1
20130015954 Thorne et al. Jan 2013 A1
20130018952 McConnell et al. Jan 2013 A1
20130018953 McConnell et al. Jan 2013 A1
20130018960 Knysz et al. Jan 2013 A1
20130024418 Strick et al. Jan 2013 A1
20130024760 Vogel et al. Jan 2013 A1
20130036369 Mitchell et al. Feb 2013 A1
20130041958 Post et al. Feb 2013 A1
20130054514 Barrett-Kahn et al. Feb 2013 A1
20130055113 Chazin et al. Feb 2013 A1
20130059598 Miyagi et al. Mar 2013 A1
20130063490 Zaman et al. Mar 2013 A1
20130086460 Folting et al. Apr 2013 A1
20130090969 Rivere Apr 2013 A1
20130097490 Kotler et al. Apr 2013 A1
20130103417 Seto et al. Apr 2013 A1
20130104035 Wagner et al. Apr 2013 A1
20130111320 Campbell et al. May 2013 A1
20130117268 Smith et al. May 2013 A1
20130159832 Ingargiola et al. Jun 2013 A1
20130159907 Brosche et al. Jun 2013 A1
20130179209 Milosevich Jul 2013 A1
20130211866 Gordon et al. Aug 2013 A1
20130212197 Karlson Aug 2013 A1
20130212234 Bartlett et al. Aug 2013 A1
20130215475 Noguchi Aug 2013 A1
20130238363 Ohta et al. Sep 2013 A1
20130238968 Barrus Sep 2013 A1
20130246384 Victor Sep 2013 A1
20130262527 Hunter Oct 2013 A1
20130268331 Bitz et al. Oct 2013 A1
20130297468 Hirsch et al. Nov 2013 A1
20130307997 O'Keefe et al. Nov 2013 A1
20130318424 Boyd Nov 2013 A1
20130339051 Dobrean Dec 2013 A1
20140002863 Hasegawa et al. Jan 2014 A1
20140006326 Bazanov Jan 2014 A1
20140012616 Moshenek Jan 2014 A1
20140019842 Montagna et al. Jan 2014 A1
20140033307 Schmidtler Jan 2014 A1
20140043331 Makinen et al. Feb 2014 A1
20140046638 Peloski Feb 2014 A1
20140052749 Rissanen Feb 2014 A1
20140058801 Deodhar et al. Feb 2014 A1
20140059017 Chaney et al. Feb 2014 A1
20140068403 Bhargav et al. Mar 2014 A1
20140074545 Minder et al. Mar 2014 A1
20140075301 Mihara Mar 2014 A1
20140078557 Hasegawa et al. Mar 2014 A1
20140082525 Kass et al. Mar 2014 A1
20140095237 Ehrler et al. Apr 2014 A1
20140101527 Suciu Apr 2014 A1
20140108985 Scott et al. Apr 2014 A1
20140109012 Choudhary et al. Apr 2014 A1
20140111516 Hall et al. Apr 2014 A1
20140115515 Adams et al. Apr 2014 A1
20140115518 Abdukalykov et al. Apr 2014 A1
20140129960 Wang et al. May 2014 A1
20140136972 Rodgers et al. May 2014 A1
20140137003 Peters et al. May 2014 A1
20140137144 Järvenpää et al. May 2014 A1
20140172475 Olliphant et al. Jun 2014 A1
20140173401 Oshlag et al. Jun 2014 A1
20140181155 Homsany Jun 2014 A1
20140188748 Cavoue et al. Jul 2014 A1
20140195933 Rao Dv Jul 2014 A1
20140214404 Kalia et al. Jul 2014 A1
20140215303 Grigorovitch et al. Jul 2014 A1
20140229816 Yakub Aug 2014 A1
20140240735 Salgado Aug 2014 A1
20140249877 Hull et al. Sep 2014 A1
20140257568 Czaja et al. Sep 2014 A1
20140278638 Kreuzkamp et al. Sep 2014 A1
20140278720 Taguchi Sep 2014 A1
20140280287 Ganti Sep 2014 A1
20140280377 Frew Sep 2014 A1
20140281868 Vogel et al. Sep 2014 A1
20140281869 Yob Sep 2014 A1
20140289223 Colwell et al. Sep 2014 A1
20140304174 Scott et al. Oct 2014 A1
20140306837 Hauck, III Oct 2014 A1
20140310345 Megiddo et al. Oct 2014 A1
20140324497 Verma et al. Oct 2014 A1
20140324501 Davidow et al. Oct 2014 A1
20140325552 Evans et al. Oct 2014 A1
20140365938 Black et al. Dec 2014 A1
20140372856 Radakovitz et al. Dec 2014 A1
20140372932 Rutherford et al. Dec 2014 A1
20150032686 Kuchoor Jan 2015 A1
20150033131 Peev et al. Jan 2015 A1
20150033149 Kuchoor Jan 2015 A1
20150035918 Matsumoto et al. Feb 2015 A1
20150046209 Choe Feb 2015 A1
20150067556 Tibrewal et al. Mar 2015 A1
20150074721 Fishman et al. Mar 2015 A1
20150074728 Chai et al. Mar 2015 A1
20150088822 Raja et al. Mar 2015 A1
20150095752 Studer et al. Apr 2015 A1
20150106736 Torman et al. Apr 2015 A1
20150125834 Mendoza May 2015 A1
20150142676 McGinnis et al. May 2015 A1
20150142829 Lee et al. May 2015 A1
20150153943 Wang Jun 2015 A1
20150154660 Weald et al. Jun 2015 A1
20150169514 Sah et al. Jun 2015 A1
20150169531 Campbell et al. Jun 2015 A1
20150188964 Sharma et al. Jul 2015 A1
20150212717 Nair et al. Jul 2015 A1
20150220491 Cochrane et al. Aug 2015 A1
20150234887 Greene et al. Aug 2015 A1
20150242091 Lu et al. Aug 2015 A1
20150249864 Tang et al. Sep 2015 A1
20150261796 Gould et al. Sep 2015 A1
20150262121 Riel-Dalpe et al. Sep 2015 A1
20150278699 Danielsson Oct 2015 A1
20150281292 Murayama Oct 2015 A1
20150295877 Roman Oct 2015 A1
20150310126 Steiner et al. Oct 2015 A1
20150317590 Karlson Nov 2015 A1
20150324453 Werner Nov 2015 A1
20150331846 Guggilla et al. Nov 2015 A1
20150363478 Haynes Dec 2015 A1
20150370540 Coslovi et al. Dec 2015 A1
20150370904 Joshi et al. Dec 2015 A1
20150378542 Saito et al. Dec 2015 A1
20150378711 Cameron et al. Dec 2015 A1
20150378979 Hirzel et al. Dec 2015 A1
20150379472 Gilmour et al. Dec 2015 A1
20160012111 Pattabhiraman et al. Jan 2016 A1
20160018962 Low et al. Jan 2016 A1
20160026939 Schiffer et al. Jan 2016 A1
20160027076 Jackson et al. Jan 2016 A1
20160035546 Platt et al. Feb 2016 A1
20160055134 Sathish et al. Feb 2016 A1
20160055374 Zhang et al. Feb 2016 A1
20160063435 Shah et al. Mar 2016 A1
20160068960 Jung et al. Mar 2016 A1
20160078368 Kakhandiki et al. Mar 2016 A1
20160088480 Chen et al. Mar 2016 A1
20160092557 Stojanovic et al. Mar 2016 A1
20160098574 Bargagni Apr 2016 A1
20160117308 Haider et al. Apr 2016 A1
20160170586 Gallo Jun 2016 A1
20160173122 Akitomi et al. Jun 2016 A1
20160210572 Shaaban et al. Jul 2016 A1
20160224532 Miller et al. Aug 2016 A1
20160224939 Chen et al. Aug 2016 A1
20160231915 Nhan et al. Aug 2016 A1
20160232489 Skaaksrud Aug 2016 A1
20160246490 Cabral Aug 2016 A1
20160253982 Cheung et al. Sep 2016 A1
20160259856 Ananthapur et al. Sep 2016 A1
20160275150 Bournonnais et al. Sep 2016 A1
20160299655 Migos et al. Oct 2016 A1
20160308963 Kung Oct 2016 A1
20160321235 He et al. Nov 2016 A1
20160321604 Imaeda et al. Nov 2016 A1
20160335302 Teodorescu Nov 2016 A1
20160335303 Madhalam et al. Nov 2016 A1
20160335604 Reminick et al. Nov 2016 A1
20160335731 Hall Nov 2016 A1
20160335903 Mendoza Nov 2016 A1
20160344828 Häusler et al. Nov 2016 A1
20160350950 Ritchie et al. Dec 2016 A1
20160381099 Keslin et al. Dec 2016 A1
20170017779 Huang et al. Jan 2017 A1
20170031967 Chavan et al. Feb 2017 A1
20170041296 Ford et al. Feb 2017 A1
20170052937 Sirven et al. Feb 2017 A1
20170061342 LoRe et al. Mar 2017 A1
20170061360 Rucker et al. Mar 2017 A1
20170061820 Firoozbakhsh Mar 2017 A1
20170063722 Cropper et al. Mar 2017 A1
20170075557 Noble et al. Mar 2017 A1
20170076101 Kochhar et al. Mar 2017 A1
20170090734 Fitzpatrick Mar 2017 A1
20170090736 King et al. Mar 2017 A1
20170091337 Patterson Mar 2017 A1
20170093876 Feng et al. Mar 2017 A1
20170109499 Doshi et al. Apr 2017 A1
20170111327 Wu Apr 2017 A1
20170116552 Deodhar et al. Apr 2017 A1
20170124042 Campbell et al. May 2017 A1
20170124048 Campbell et al. May 2017 A1
20170124055 Radakovitz et al. May 2017 A1
20170124740 Campbell May 2017 A1
20170126772 Campbell et al. May 2017 A1
20170132296 Ding May 2017 A1
20170132652 Kedzlie et al. May 2017 A1
20170139874 Chin May 2017 A1
20170139884 Bendig et al. May 2017 A1
20170139891 Ah-Soon et al. May 2017 A1
20170140047 Bendig et al. May 2017 A1
20170140219 King et al. May 2017 A1
20170153771 Chu Jun 2017 A1
20170161246 Klima Jun 2017 A1
20170177556 Fay et al. Jun 2017 A1
20170177888 Arora et al. Jun 2017 A1
20170185575 Sood et al. Jun 2017 A1
20170185668 Convertino et al. Jun 2017 A1
20170200122 Edson et al. Jul 2017 A1
20170206366 Fay et al. Jul 2017 A1
20170212924 Semlani et al. Jul 2017 A1
20170220813 Mullins et al. Aug 2017 A1
20170221072 AthuluruTlrumala et al. Aug 2017 A1
20170228421 Sharma et al. Aug 2017 A1
20170228445 Chiu et al. Aug 2017 A1
20170228460 Amel Aug 2017 A1
20170229152 Loganathan et al. Aug 2017 A1
20170236081 Grady Smith et al. Aug 2017 A1
20170242921 Rota Aug 2017 A1
20170257517 Panda Sep 2017 A1
20170262786 Khasis Sep 2017 A1
20170270970 Ho et al. Sep 2017 A1
20170272316 Johnson et al. Sep 2017 A1
20170272331 Lissack Sep 2017 A1
20170277620 Kadioglu Sep 2017 A1
20170277669 Sekharan Sep 2017 A1
20170285879 Pilkington et al. Oct 2017 A1
20170285890 Dolman Oct 2017 A1
20170289619 Xu et al. Oct 2017 A1
20170301039 Dyer et al. Oct 2017 A1
20170315683 Boucher et al. Nov 2017 A1
20170315974 Kong et al. Nov 2017 A1
20170315979 Boucher et al. Nov 2017 A1
20170324692 Zhou Nov 2017 A1
20170329479 Rauschenbach et al. Nov 2017 A1
20170351252 Kleifges et al. Dec 2017 A1
20170372442 Mejias Dec 2017 A1
20170374205 Panda Dec 2017 A1
20180011827 Avery et al. Jan 2018 A1
20180025084 Conlan et al. Jan 2018 A1
20180026954 Toepke et al. Jan 2018 A1
20180032492 Altshuller et al. Feb 2018 A1
20180032570 Miller et al. Feb 2018 A1
20180039651 Tobin et al. Feb 2018 A1
20180055434 Cheung et al. Mar 2018 A1
20180075104 Oberbreckling et al. Mar 2018 A1
20180075115 Murray et al. Mar 2018 A1
20180075413 Culver et al. Mar 2018 A1
20180075560 Thukral et al. Mar 2018 A1
20180081863 Bathla Mar 2018 A1
20180081868 Willcock et al. Mar 2018 A1
20180088753 Viégas et al. Mar 2018 A1
20180088989 Nield et al. Mar 2018 A1
20180089299 Collins et al. Mar 2018 A1
20180095938 Monte Apr 2018 A1
20180096417 Cook et al. Apr 2018 A1
20180109760 Metter et al. Apr 2018 A1
20180121028 Kuscher et al. May 2018 A1
20180121994 Matsunaga et al. May 2018 A1
20180128636 Zhou May 2018 A1
20180129651 Latvala et al. May 2018 A1
20180157455 Troy et al. Jun 2018 A1
20180157467 Stachura Jun 2018 A1
20180157468 Stachura Jun 2018 A1
20180157633 He et al. Jun 2018 A1
20180173715 Dunne Jun 2018 A1
20180181650 Komatsuda et al. Jun 2018 A1
20180181716 Mander et al. Jun 2018 A1
20180189734 Newhouse et al. Jul 2018 A1
20180210936 Reynolds et al. Jul 2018 A1
20180225270 Bhide et al. Aug 2018 A1
20180260371 Theodore et al. Sep 2018 A1
20180276417 Cerezo Sep 2018 A1
20180285918 Staggs Oct 2018 A1
20180293217 Callaghan Oct 2018 A1
20180293587 Oda Oct 2018 A1
20180293669 Jackson et al. Oct 2018 A1
20180329930 Eberlein et al. Nov 2018 A1
20180330320 Kohli Nov 2018 A1
20180357305 Kinast et al. Dec 2018 A1
20180365429 Segal Dec 2018 A1
20180367484 Rodriguez et al. Dec 2018 A1
20180373434 Switzer et al. Dec 2018 A1
20180373757 Schukovets et al. Dec 2018 A1
20190005094 Yi Jan 2019 A1
20190012342 Cohn Jan 2019 A1
20190034395 Curry et al. Jan 2019 A1
20190036989 Eirinberg et al. Jan 2019 A1
20190042628 Rajpara Feb 2019 A1
20190050445 Griffith et al. Feb 2019 A1
20190050466 Kim et al. Feb 2019 A1
20190050812 Boileau Feb 2019 A1
20190056856 Simmons et al. Feb 2019 A1
20190065545 Hazel et al. Feb 2019 A1
20190068703 Vora et al. Feb 2019 A1
20190073350 Shiotani Mar 2019 A1
20190095413 Davis et al. Mar 2019 A1
20190097909 Puri et al. Mar 2019 A1
20190108046 Spencer-Harper et al. Apr 2019 A1
20190113935 Kuo et al. Apr 2019 A1
20190114308 Hancock Apr 2019 A1
20190114589 Voss et al. Apr 2019 A1
20190123924 Embiricos et al. Apr 2019 A1
20190130611 Black et al. May 2019 A1
20190138588 Silk et al. May 2019 A1
20190138653 Roller et al. May 2019 A1
20190147030 Stein et al. May 2019 A1
20190155821 Dirisala May 2019 A1
20190179501 Seeley et al. Jun 2019 A1
20190208058 Dvorkin et al. Jul 2019 A1
20190213557 Dotan-Cohen et al. Jul 2019 A1
20190220161 Loftus et al. Jul 2019 A1
20190236188 McKenna Aug 2019 A1
20190243879 Harley et al. Aug 2019 A1
20190251884 Burns et al. Aug 2019 A1
20190258461 Li et al. Aug 2019 A1
20190258706 Li et al. Aug 2019 A1
20190286839 Mutha et al. Sep 2019 A1
20190306009 Makovsky et al. Oct 2019 A1
20190324840 Malamut et al. Oct 2019 A1
20190325012 Delaney et al. Oct 2019 A1
20190340550 Denger et al. Nov 2019 A1
20190347077 Huebra Nov 2019 A1
20190361879 Rogynskyy et al. Nov 2019 A1
20190361971 Zenger et al. Nov 2019 A1
20190364009 Joseph et al. Nov 2019 A1
20190371442 Schoenberg Dec 2019 A1
20190377791 Mahmoud et al. Dec 2019 A1
20190391707 Ristow et al. Dec 2019 A1
20200005248 Gerzi et al. Jan 2020 A1
20200005295 Murphy Jan 2020 A1
20200012629 Lereya et al. Jan 2020 A1
20200019548 Agnew et al. Jan 2020 A1
20200019595 Azua Jan 2020 A1
20200026352 Wang et al. Jan 2020 A1
20200026397 Wohlstadter et al. Jan 2020 A1
20200042648 Rao Feb 2020 A1
20200050696 Mowatt et al. Feb 2020 A1
20200053176 Jimenez et al. Feb 2020 A1
20200125574 Ghoshal et al. Apr 2020 A1
20200134002 Tung et al. Apr 2020 A1
20200142546 Breedvelt-Schouten et al. May 2020 A1
20200151630 Shakhnovich May 2020 A1
20200159558 Bak et al. May 2020 A1
20200175094 Palmer Jun 2020 A1
20200192785 Chen Jun 2020 A1
20200247661 Rao et al. Aug 2020 A1
20200265112 Fox et al. Aug 2020 A1
20200293616 Nelson et al. Sep 2020 A1
20200301678 Burman et al. Sep 2020 A1
20200301902 Maloy et al. Sep 2020 A1
20200310835 Momchilov Oct 2020 A1
20200326824 Alonso et al. Oct 2020 A1
20200327244 Blass et al. Oct 2020 A1
20200334019 Bosworth et al. Oct 2020 A1
20200348809 Drescher Nov 2020 A1
20200349320 Owens Nov 2020 A1
20200356740 Principato Nov 2020 A1
20200356873 Nawrocke et al. Nov 2020 A1
20200374146 Chhabra et al. Nov 2020 A1
20200380212 Butler et al. Dec 2020 A1
20200380449 Choi Dec 2020 A1
20200387664 Kusumura et al. Dec 2020 A1
20200401581 Eubank et al. Dec 2020 A1
20200409949 Saxena et al. Dec 2020 A1
20210014136 Rath Jan 2021 A1
20210019287 Prasad et al. Jan 2021 A1
20210021603 Gibbons Jan 2021 A1
20210034058 Subramanian et al. Feb 2021 A1
20210035069 Parikh Feb 2021 A1
20210042796 Khoury et al. Feb 2021 A1
20210049524 Nachum et al. Feb 2021 A1
20210049555 Shor Feb 2021 A1
20210055955 Yankelevich et al. Feb 2021 A1
20210056509 Lindy Feb 2021 A1
20210072883 Migunova et al. Mar 2021 A1
20210073526 Zeng et al. Mar 2021 A1
20210084120 Fisher et al. Mar 2021 A1
20210124749 Suzuki et al. Apr 2021 A1
20210124872 Lereya Apr 2021 A1
20210136027 Barbitta et al. May 2021 A1
20210149553 Lereya et al. May 2021 A1
20210150489 Haramati et al. May 2021 A1
20210165782 Deshpande et al. Jun 2021 A1
20210166196 Lereya et al. Jun 2021 A1
20210166339 Mann et al. Jun 2021 A1
20210173682 Chakraborti et al. Jun 2021 A1
20210174006 Stokes Jun 2021 A1
20210192126 Gehrmann et al. Jun 2021 A1
20210248311 Helft et al. Aug 2021 A1
20210264220 Wei et al. Aug 2021 A1
20210326519 Lin et al. Oct 2021 A1
20210342785 Mann et al. Nov 2021 A1
20220099454 Decrop et al. Mar 2022 A1
20220121325 Roberts et al. Apr 2022 A1
20220221591 Smith et al. Jul 2022 A1
Foreign Referenced Citations (22)
Number Date Country
2 828 011 Sep 2012 CA
103064833 Apr 2013 CN
107123424 Sep 2017 CN
107422666 Dec 2017 CN
107623596 Jan 2018 CN
107885656 Apr 2018 CN
108717428 Oct 2018 CN
112929172 Jun 2021 CN
3 443 466 Dec 2021 EP
20150100760 Sep 2015 KR
WO 2004100015 Nov 2004 WO
WO 2006116580 Nov 2006 WO
WO 2008109541 Sep 2008 WO
WO 2014088393 Jun 2014 WO
WO 2017202159 Nov 2017 WO
WO 2018023798 Feb 2018 WO
WO 2020187408 Sep 2020 WO
WO 2021096944 May 2021 WO
WO 2021144656 Jul 2021 WO
WO 2021161104 Aug 2021 WO
WO 2021220058 Nov 2021 WO
WO 2022153122 Jul 2022 WO
Non-Patent Literature Citations (95)
Entry
Donath, “Interfaces Make Meaning” chapter from the Social Machine: Designs for Living Online, pp. 41-76, copyright 2014. (Year: 2014).
Alessio et al., Monday.com Walkthrough 2018\All Features, Platforms & Thoughts, Mar. 1, 2018, pp. 1-55, 2018.
Rodrigo et al., Project Management with Monday.com: a 101 Introduction; Jul. 22, 2019, pp. 1-21, 2019.
International Search Report and Written Opinion of the International Searching Authority in PCT/IB2020/000658, dated Nov. 11, 2020 (12 pages).
International Search Report in PCT/IB2020/000974, dated May 3, 2021 (19 pages).
International Search Report in PCT/IB2021/000090 dated Jul. 27, 2021.
ShowMyPC, “Switch Presenter While Using ShowMyPC”; web.archive.org; Aug. 20, 2016.
International Search Report and Written Opinion of the International Search Authority in PCT/IB2020/000024, dated May 3, 2021 (13 pages).
“Pivot table—Wikipedia”; URL: https://en.wikipedia.org/w/index.php?title=Pivot_table&oldid=857163289, originally retrieved on Oct. 23, 2019; retrieved on Jul. 16, 2021.
Vishal Singh, “A Theoretical Framework of a BIM-based Multi-Disciplinary Collaboration Platform”, Nov. 5, 2020, Automation in Construction, 20 (2011), pp. 134-144 (Year: 2011).
Edward A. Stohr, Workflow Automation: Overview and Research Issues, 2001, Information Systems Frontiers 3:3, pp. 281-296 (Year: 2001).
International Search Report and Written Opinion of the International Search Authority in PCT/IB2021/000297, dated Oct. 12, 2021 (20 pages).
Dapulse.com “features”, extracted from web.archive.org/web/2014091818421/https://dapulse.com/features; Sep. 2014 (Year: 2014).
Stephen Larson et al., Introducing Data Mining Concepts Using Microsoft Excel's Table Analysis Tools, Oct. 2015, [Retrieved on Nov. 19, 2021], Retrieved from the internet: <URL: https://dl.acm.org/doi/pdf/10.5555/2831373.2831394> 3 Pages (127-129) (Year: 2015).
Isaiah Pinchas et al., Lexical Analysis Tool, May 2004, [Retrieved on Nov. 19, 2021], Retrieved from the internet: <URL: https://dl.acm.org/doi/pdf/10.1145/997140.997147> 9 Pages (66-74) (Year: 2004).
Sajjad Bahrebar et al., “A Novel Type-2 Fuzzy Logic for Improved Risk Analysis of Proton Exchange Membrane Fuel Cells in Marine Power Systems Application”, Energies, 11, 721, pp. 1-16, Mar. 22, 2018.
Pedersen et al., “Tivoli: an electronic whiteboard for informal workgroup meetings”, Conference on Human Factors in Computing Systems: Proceedings of the INTERACT '93 and CHI '93 conference on Human factors in computing systems; Apr. 24-29, 1993:391-398. (Year: 1993).
Kollmann, Franz, “Realizing Fine-Granular Read and Write Rights on Tree Structured Documents.” in the Second International Conference on Availability, Reliability and Security (ARES'07), pp. 517-523. IEEE, 2007. (Year: 2007).
Baarslag, “Negotiation as an Interaction Mechanism for Deciding App Permissions.” In Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems, pp. 2012-2019. 2016 (Year: 2016).
Peltier, “Clustered and Stacked Column and Bar Charts”, Aug. 2011, Peltier Technical Services, Inc., pp. 1-128; (Year: 2011).
Beate List, “An Evaluation of Conceptual Business Process Modelling Languages”, 2006, SAC'06, Apr. 23-27, pp. 1532-1539 (Year: 2006).
“Demonstracion en espanol de Monday.com”, published Feb. 20, 2019. https://www.youtube.com/watch?v=z0qydTgof1A (Year: 2019).
Desmedt, Yvo, and Arash Shaghaghi, “Function-Based Access Control (FBAC) From Access Control Matrix to Access Control Tensor.” In Proceedings of the 8th ACM CCS International Workshop on Managing Insider Security Threats, pp. 89-92. (2016).
Anupam, V., et al., “Personalizing the Web Using Site Descriptions”, Proceedings of the Tenth International Workshop on Database and Expert Systems Applications, ISBN: 0-7695-0281-4, DOI: 10.1109/DEXA.1999.795275, Jan. 1, 1999, pp. 732-738. (Year: 1999).
Gutwin, C. et al., “Supporting Informal Collaboration in Shared-Workspace Groupware”, J. Univers. Comput. Sci., 14 (9), 1411-1434 (2008).
Barai, S., et al., “Image Annotation System Using Visual and Textual Features”, In: Proceedings of the 16th International Conference on Distributed Multi-media Systems, pp. 289-296 (2010).
Monday.com et al. “https://www.youtube.com/watch?v=VpbgWyPf74g” Aug. 9, 2019 (Year: 2019).
B. Ionescu, C. Gadea, B. Solomon, M. Trifan, D. Ionescu and V. Stoicu-Tivadar, “A chat-centric collaborative environment for web-based real-time collaboration,” 2015 IEEE 10th Jubilee International Symposium on Applied Computational Intelligence and Informatics, Timisoara, Romania, 2015, pp. 105-110 (Year: 2015).
Stancu, Florin-Alexandru, Mihai Chiroiu, and Razvan Rughinis. “SecCollab—Improving Confidentiality for Existing Cloud-Based Collaborative Editors.” In 2017 21st International Conference on Control Systems and Computer Science (CSCS), pp. 324-331. IEEE, 2017. (Year: 2017).
Susanne Hupfer, Li-Te Cheng, Steven Ross, and John Patterson. 2004. Introducing collaboration into an application development environment. In Proceedings of the 2004 ACM conference on Computer supported cooperative work (CSCW '04). Association for Computing Machinery, New York, NY, USA, 21-24 (Year: 2004).
U.S. Appl. No. 17/143,897, filed Jan. 7, 2021.
U.S. Appl. No. 17/143,603, filed Jan. 7, 2021.
U.S. Appl. No. 17/143,745, filed Jan. 7, 2021.
U.S. Appl. No. 17/143,482, filed Jan. 7, 2021.
U.S. Appl. No. 17/143,768, filed Jan. 7, 2021.
U.S. Appl. No. 17/143,677, filed Jan. 7, 2021.
U.S. Appl. No. 17/143,653, filed Jan. 7, 2021.
U.S. Appl. No. 17/143,916, filed Jan. 7, 2021.
U.S. Appl. No. 17/143,475, filed Jan. 7, 2021.
U.S. Appl. No. 17/143,865, filed Jan. 7, 2021.
U.S. Appl. No. 17/143,462, filed Jan. 7, 2021.
U.S. Appl. No. 17/143,470, filed Jan. 7, 2021.
U.S. Appl. No. 17/143,905, filed Jan. 7, 2021.
U.S. Appl. No. 17/143,798, filed Jan. 7, 2021.
U.S. Appl. No. 17/143,892, filed Jan. 7, 2021.
U.S. Appl. No. 17/243,716, filed Apr. 29, 2021.
U.S. Appl. No. 17/243,727, filed Apr. 29, 2021.
U.S. Appl. No. 17/232,978, filed Apr. 16, 2021.
U.S. Appl. No. 17/243,809, filed Apr. 29, 2021.
U.S. Appl. No. 17/243,901, filed Apr. 29, 2021.
U.S. Appl. No. 17/232,354, filed Apr. 16, 2021.
U.S. Appl. No. 17/243,898, filed Apr. 29, 2021.
U.S. Appl. No. 17/243,969, filed Apr. 29, 2021.
U.S. Appl. No. 17/243,742, filed Apr. 29, 2021.
U.S. Appl. No. 17/243,752, filed Apr. 29, 2021.
U.S. Appl. No. 17/232,754, filed Apr. 16, 2021.
U.S. Appl. No. 17/232,827, filed Apr. 16, 2021.
U.S. Appl. No. 17/243,763, filed Apr. 29, 2021.
U.S. Appl. No. 17/243,848, filed Apr. 29, 2021.
U.S. Appl. No. 17/243,934, filed Apr. 29, 2021.
U.S. Appl. No. 17/244,121, filed Apr. 29, 2021.
U.S. Appl. No. 17/243,807, filed Apr. 29, 2021.
U.S. Appl. No. 17/244,027, filed Apr. 29, 2021.
U.S. Appl. No. 17/244,157, filed Apr. 29, 2021.
U.S. Appl. No. 17/243,725, filed Apr. 29, 2021.
U.S. Appl. No. 17/243,737, filed Apr. 29, 2021.
U.S. Appl. No. 17/243,748, filed Apr. 29, 2021.
U.S. Appl. No. 16/453,065, filed Jun. 26, 2019.
U.S. Appl. No. 17/243,691, filed Apr. 29, 2021.
U.S. Appl. No. 17/243,722, filed Apr. 29, 2021.
U.S. Appl. No. 17/243,892, filed Apr. 29, 2021.
U.S. Appl. No. 17/243,977, filed Apr. 29, 2021.
U.S. Appl. No. 17/243,764, filed Apr. 29, 2021.
U.S. Appl. No. 17/243,837, filed Apr. 29, 2021.
U.S. Appl. No. 17/243,729, filed Apr. 29, 2021.
U.S. Appl. No. 17/243,802, filed Apr. 29, 2021.
U.S. Appl. No. 17/242,452, filed Apr. 28, 2021.
U.S. Appl. No. 17/243,891, filed Apr. 29, 2021.
U.S. Appl. No. 17/243,775, filed Apr. 29, 2021.
U.S. Appl. No. 17/243,731, filed Apr. 29, 2021.
U.S. Appl. No. 17/243,768, filed Apr. 29, 2021.
U.S. Appl. No. 16/502,679, filed Jul. 3, 2019.
U.S. Appl. No. 17/565,652, filed Dec. 30, 2021.
U.S. Appl. No. 17/565,699, filed Dec. 30, 2021.
U.S. Appl. No. 17/565,853, filed Dec. 30, 2021.
U.S. Appl. No. 17/565,880, filed Dec. 30, 2021.
U.S. Appl. No. 17/564,745, filed Dec. 29, 2021.
U.S. Appl. No. 17/565,526, filed Dec. 30, 2021.
U.S. Appl. No. 17/565,614, filed Dec. 30, 2021.
U.S. Appl. No. 17/565,718, filed Dec. 30, 2021.
U.S. Appl. No. 17/565,534, filed Dec. 30, 2021.
U.S. Appl. No. 17/565,801, filed Dec. 30, 2021.
U.S. Appl. No. 17/565,821, filed Dec. 30, 2021.
U.S. Appl. No. 17/565,780, filed Dec. 30, 2021.
Using Filters in Overview, published Mar. 7, 2017. https://www.youtube.com/watch?v=hycANhz7gww (Year: 2017).
Related Publications (1)
Number Date Country
20220222427 A1 Jul 2022 US
Provisional Applications (3)
Number Date Country
63273448 Oct 2021 US
63273453 Oct 2021 US
63233925 Aug 2021 US
Continuations (1)
Number Date Country
Parent PCT/IB2021/062440 Dec 2021 US
Child 17565843 US
Continuation in Parts (3)
Number Date Country
Parent PCT/IB2021/000297 Apr 2021 US
Child PCT/IB2021/062440 US
Parent PCT/IB2021/000090 Feb 2021 US
Child PCT/IB2021/000297 US
Parent PCT/IB2021/000024 Jan 2021 US
Child PCT/IB2021/000090 US