Systems and methods to generate records within a collaboration environment based on a machine learning model trained from a text corpus

Information

  • Patent Grant
  • Patent Number
    12,118,514
  • Date Filed
    Thursday, February 17, 2022
  • Date Issued
    Tuesday, October 15, 2024
  • Inventors
  • Original Assignees
  • Examiners
    • Monfeldt; Sarah M
    • Monaghan; Michael J.
  • Agents
    • Esplin & Associates, PC
Abstract
Systems and methods to generate records within a collaboration environment are described herein. Exemplary implementations may perform one or more of: manage environment state information maintaining a collaboration environment; effectuate presentation of a user interface through which users upload digital assets representing recorded audio and/or video content; obtain input information defining the digital assets input via the user interface; generate transcription information characterizing the recorded audio and/or video content of the digital assets; provide the transcription information as input into a trained machine-learning model; obtain the output from the trained machine-learning model, the output defining one or more new records based on the transcripts; and/or other operations.
Description
FIELD OF THE DISCLOSURE

The present disclosure relates to systems and methods to generate records within a collaboration environment, in particular using a machine learning model trained from a text corpus that generates records from asynchronously recorded audio and/or video content.


BACKGROUND

Web-based collaboration environments, sometimes referred to as work management platforms, may enable users to assign projects, tasks, or other assignments to assignees (e.g., other users) to complete. A collaboration environment may comprise an environment in which individual users and/or a virtual team of users do their work, enabling the users to work in a more organized and efficient manner even when remotely located from each other.


SUMMARY

Hosting a web-based collaboration environment poses many challenges. For example, operating the collaboration environment may require precise ways of creation, storage, management, and/or provision of information that makes up the collaboration environment. One way that operators look to improve the operation of the collaboration environment is to improve parts of the collaboration environment involving substantial human-machine interaction. For example, users may traditionally manually generate work unit records for units of work within the collaboration environment. The operators of the collaboration environment were traditionally tasked with finding ways to design and configure user interfaces which would provide user-friendly and intuitive ways to receive this manual input. However, even with improved user interfaces that walk through manual generation of work unit records, the requirement for human-machine interactions is time-consuming, may decrease workflow efficiency, and/or may be prone to user error. The inventors of the present disclosure have also identified that work unit records are often created by the users from some reference material. For example, after a recorded video or audio meeting or dictation, a user may generate one or more work unit records that reflect the work to be done following the recording. This translation from one format (recorded audio and/or video) into manually provided, precise definitions of the information that makes up the collaboration environment further compounds these existing problems.


To address these and/or other problems, one or more implementations presented herein propose a technique to automatically generate records from a recording of audio and/or video. The audio and/or video may have been recorded asynchronously with respect to the creation of one or more records. The recorded audio and/or video may be referred to as “asynchronous audio and/or video.” The records may be automatically generated from digital assets the user uploads which represent the asynchronous audio and/or video. By way of non-limiting illustration, a user may upload a digital asset (e.g., a video file, an audio file, and/or other assets) into a user interface. The system may carry out one or more processing techniques to extract the content from the digital assets and structure the content into a format that facilitates creation of a record, such as a work unit record. By way of non-limiting illustration, the content may be parsed to identify values of parameters that make up a work unit record. In some implementations, when a work unit record is generated, one or more fields may be automatically filled based on context surrounding the uploaded asset(s).
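By way of illustration only, the parsing step described above might be sketched as follows. The regular-expression heuristic and the `WorkUnitRecord` fields below are hypothetical stand-ins for the trained model and for the actual work unit parameters, which the disclosure leaves open:

```python
import re
from dataclasses import dataclass
from typing import Optional

@dataclass
class WorkUnitRecord:
    # Hypothetical subset of work unit parameters; the real schema is not fixed here.
    name: str
    assignee: Optional[str] = None

def parse_content(transcript: str) -> list:
    """Toy stand-in for model-driven parsing: one record per action-item sentence."""
    records = []
    for sentence in re.split(r"[.!?]\s*", transcript):
        match = re.match(r"(\w+) (?:will|should) (.+)", sentence.strip())
        if match:
            records.append(WorkUnitRecord(name=match.group(2), assignee=match.group(1)))
    return records

records = parse_content(
    "Alice will draft the launch plan. We met today. Bob should review the budget."
)
```

A real implementation would replace the regular expression with the trained machine-learning model described below; the sketch only shows the shape of the content-to-record translation.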


In some implementations, a text corpus may be utilized as training data for a machine-learning model which performs the extraction and/or structuring of the content. In some implementations, the text corpus may comprise text that makes up one or more existing work unit records (and/or other records) present in the collaboration environment. These along with other features and/or functionality presented herein, may be recognized by persons of ordinary skill in the art as providing improvements upon the operation of a collaboration environment including, among others, increased efficiency and accuracy in the creation and management of the information making up records of the collaboration environment.
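A minimal sketch of how such a text corpus of existing records might be assembled into supervised training pairs follows. The record fields and the prompt template are assumptions for illustration, not the disclosed training procedure:

```python
# Existing work unit records serve as the text corpus (fields are hypothetical).
existing_records = [
    {"name": "Draft launch plan", "assignee": "Alice", "description": "Outline the Q3 launch."},
    {"name": "Review budget", "assignee": "Bob", "description": "Check Q3 spend."},
]

def to_training_pair(record):
    # Input: free text resembling a transcript snippet; target: the structured record.
    text = f"{record['assignee']} will {record['name'].lower()}. {record['description']}"
    return {"input": text, "target": record}

training_data = [to_training_pair(r) for r in existing_records]
```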


One or more implementations of a system to generate records within a collaboration environment may include one or more hardware processors configured by machine-readable instructions and/or other components. Executing the machine-readable instructions may cause the one or more hardware processors to facilitate generating records within a collaboration environment. The machine-readable instructions may include one or more computer program components. The one or more computer program components may include one or more of an environment state component, a user interface component, a content component, a work creation component, and/or other components.


The environment state component may be configured to manage environment state information maintaining a collaboration environment. The collaboration environment may be configured to facilitate interaction by users with the collaboration environment. The environment state information may include one or more records. The one or more records may include work unit records, project records, objective records, and/or other records. The work unit records may include work information and/or other information. The work information may characterize units of work created, managed, and/or assigned within the collaboration environment to the users who are expected to accomplish one or more actions to complete the units of work. The work information may comprise values of work unit parameters characterizing the units of work.


The user interface component may be configured to effectuate presentation of a user interface through which users upload digital assets representing recorded content, e.g., video and/or audio content. The video and/or audio content may include human utterances and/or other content. The user interface may include one or more portions. The one or more portions may include an input portion. The input portion may be configured to receive user input of individual ones of the digital assets.


The user interface component may be configured to obtain input information defining the digital assets input via the user interface. The input information may define a first digital asset input into the user interface via the input portion. The first digital asset may include first recorded audio and/or video content including a first set of utterances.


In some implementations, the content component may be configured to, in response to obtaining the input information, generate transcription information and/or other information. The transcription information may characterize the audio and/or video content. The transcription information may include transcripts and/or other information. The transcripts may include text strings characterizing one or more of the utterances, physical gestures, facial expressions, physical movement, and/or other content. By way of non-limiting illustration, first transcription information may be generated from the first recorded audio and/or video content of the first digital asset. The first transcription information may include a first transcript comprising a first set of text strings characterizing the first set of utterances and/or other content.
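The transcription information described above might be represented as a simple data structure. The classes and the hard-coded utterances below are illustrative assumptions; a real implementation would obtain the utterances from a speech-to-text service:

```python
from dataclasses import dataclass

@dataclass
class Transcript:
    text_strings: list  # one text string per utterance (and/or gesture description)

@dataclass
class TranscriptionInformation:
    asset_id: str
    transcript: Transcript

def build_transcription_information(asset_id, utterances):
    # `utterances` would come from speech-to-text; hard-coded here for illustration.
    return TranscriptionInformation(asset_id, Transcript(list(utterances)))

info = build_transcription_information(
    "asset-1", ["Alice will draft the launch plan.", "Bob should review the budget."]
)
```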


The work creation component may be configured to generate individual records. The work creation component may be configured to provide the transcription information as input into a trained machine-learning model. The trained machine-learning model may be configured to provide output based on input(s). The output may include new ones of the records. By way of non-limiting illustration, the input into the trained machine-learning model may include the first transcription information.


The work creation component may be configured to obtain the output from the trained machine-learning model. The output may include definitions of the new ones of the records based on the transcripts and/or other content. The output may define the new ones of the records by including one or more values of one or more parameters of the new ones of the records. By way of non-limiting illustration, the output may include a first work unit record based on the input of the first transcription information.
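If the model's output were serialized as JSON (an assumption; the disclosure does not fix an output format), turning that output into new records might look like the following sketch:

```python
import json

# Hypothetical model output defining one new work unit record.
model_output = json.dumps({
    "records": [
        {"type": "work_unit", "name": "Review budget", "assignee": "Bob"},
    ]
})

def records_from_output(output: str):
    # Keep only work unit records; other record types could be handled similarly.
    return [r for r in json.loads(output)["records"] if r.get("type") == "work_unit"]

new_records = records_from_output(model_output)
```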


As used herein, any association (or relation, or reflection, or indication, or correspondency) involving servers, processors, client computing platforms, and/or another entity or object that interacts with any part of the system and/or plays a part in the operation of the system, may be a one-to-one association, a one-to-many association, a many-to-one association, and/or a many-to-many association or N-to-M association (note that N and M may be different numbers greater than 1).


These and other features, and characteristics of the present technology, as well as the methods of operation and functions of the related elements of structure and the combination of parts and economies of manufacture, will become more apparent upon consideration of the following description and the appended claims with reference to the accompanying drawings, all of which form a part of this specification, wherein like reference numerals designate corresponding parts in the various figures. It is to be expressly understood, however, that the drawings are for the purpose of illustration and description only and are not intended as a definition of the limits of the invention. As used in the specification and in the claims, the singular form of “a”, “an”, and “the” include plural referents unless the context clearly dictates otherwise.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 illustrates a system configured to generate records within a collaboration environment, in accordance with one or more implementations.



FIG. 2 illustrates a method to generate records within a collaboration environment, in accordance with one or more implementations.



FIG. 3 illustrates a user interface, in accordance with one or more implementations.



FIG. 4 illustrates a user interface, in accordance with one or more implementations.



FIG. 5 illustrates a user interface, in accordance with one or more implementations.





DETAILED DESCRIPTION


FIG. 1 illustrates a system 100 configured to generate records within a collaboration environment, in accordance with one or more implementations. Often, records may be manually created by users from some reference material. For example, after a recorded video or audio meeting or dictation, a user may generate one or more records that reflect real work to be done following the recording. Manually generating records may be time-consuming, may decrease workflow efficiency, and may leave the detailed requirements for creating records in the collaboration environment prone to user error.


To address these and/or other problems, one or more implementations presented herein propose a technique to automatically generate records from a recording of audio and/or video. The audio and/or video may have been recorded asynchronously with respect to the creation of one or more work unit records (and may be referred to as “asynchronous audio and/or video”). In some implementations, the system 100 may utilize a text corpus as training data for a machine-learning model to perform extraction of content from the recordings and/or structuring of the content into a structure that facilitates the generation of records. In some implementations, the text corpus may comprise text that makes up one or more existing work unit records (and/or other records) present in the collaboration environment.


In some implementations, system 100 may include one or more of one or more servers 102, one or more client computing platforms 104, external resource(s) 126, and/or other components. Server(s) 102 may be configured to communicate with one or more client computing platforms 104, one or more external resources 126, and/or other entities of system 100 according to a client/server architecture and/or other architectures. Client computing platform(s) 104 may be configured to communicate with other client computing platforms via server(s) 102 and/or according to a peer-to-peer architecture and/or other architectures. Users may access system 100 and/or instances of the collaboration environment via client computing platform(s) 104. Server(s) 102 may be remote from client computing platform(s) 104. Client computing platform(s) 104 may be remote from each other.


Server(s) 102 may include one or more of non-transitory electronic storage 128, one or more processors 130 configured by machine-readable instructions 106, and/or other components. The non-transitory electronic storage 128 may store one or more records and/or other information. Machine-readable instructions 106 may include one or more instruction components. The instruction components may include computer program components. Executing the machine-readable instructions 106 may cause server(s) 102 to facilitate generating records within a collaboration environment. The computer program components may include one or more of an environment state component 108, user interface component 110, a content component 112, work creation component 114, and/or other components.


Environment state component 108 may be configured to manage environment state information and/or other information used in maintaining a collaboration environment. The collaboration environment may be configured to facilitate interaction by users with the collaboration environment. The environment state information may include one or more records. The one or more records may include one or more of user records, work unit records, project records, objective records, and/or other records. The user records may include user information describing the users of the collaboration environment. The work unit records may include work information describing units of work assigned to, created by, and/or managed by the users within the collaboration environment. The project records may include project information describing projects created, assigned, and/or managed within the collaboration environment. An individual project may include individual sets of the units of work supporting the individual projects. The objective records may include objective information describing business objectives specified within the collaboration environment.


The user information in the user records may include values of user parameters and/or other information. The values of the user parameters may be organized in the user records corresponding to users interacting with and/or viewing the collaboration environment. The values of the user parameters may include information describing the users, their actions within the collaboration environment, their settings, and/or other user information; and/or metadata associated with the users, their actions within the environment, their settings, and/or other user information. Individual ones of the users may be associated with individual ones of the user records. A user record may define values of the user parameters associated with a given user.


The values of the user parameters may, by way of non-limiting example, specify one or more of: a user name, a group, a user account, user role information, a user department, descriptive user content, a to-email, a from-email, a photo, an organization, a workspace, one or more user comments, one or more teams the user belongs to, one or more of the user display settings (e.g., colors, size, project order, task order, other unit of work order, etc.), one or more authorized applications, one or more interaction parameters (e.g., indicating a user is working on/worked on a given unit of work, a given user viewed a given unit of work, a given user selected a given unit of work, a timeframe a given user last interacted with and/or worked on a given unit of work, a time period that a given unit of work has been idle, and/or other interaction parameters), one or more notification settings, one or more progress parameters, status information for one or more work units the user is associated with (units of work assigned to the user, assigned to other users by the user, completed by the user, past-due date, and/or other information), one or more performance/productivity metrics of a given user (e.g., how many units of work the user has completed, how quickly the user completed the units of work, how quickly the user completes certain types of work units, the efficiency of the user, bandwidth of the user, activity level of the user, how many business objectives the user has helped fulfill through their completion of units of work, etc.), application access information (e.g., username/password for one or more third-party applications), one or more favorites and/or priorities, schedule information, and/or other information.


Schedule information for the individual users may include one or more calendar entries associated with the individual users. The individual calendar entries may be associated with individual start dates and individual end dates. In some implementations, schedule information may be stored locally within electronic storage 128 by virtue of features and/or functionality provided within a collaboration environment. By way of non-limiting illustration, a collaboration environment may have features and/or functionality of a calendar application configured to facilitate calendaring entries into a schedule. It is noted that schedule information may be determined through features and/or functionality provided by one or more external resources 126. By way of non-limiting illustration, an external resource may include a calendar application which may be external to a collaboration environment. The collaboration environment may have permissions to access the external calendar application to determine and/or obtain schedule information.


The work information in the work unit records may include values of one or more work unit parameters. The values of the work unit parameters may be organized in work unit records corresponding to units of work managed, created, and/or assigned within the collaboration environment. A given unit of work may have one or more assignees and/or collaborators working on the given work unit. Units of work may include one or more to-do items, action items, objectives, and/or other units of work one or more users should accomplish and/or plan on accomplishing. Units of work may be created by a given user for the given user and/or created by the given user and assigned to one or more other users. Individual units of work may include one or more of an individual task, an individual sub-task, and/or other units of work assigned to and/or associated with one or more users. Individual units of work may include one or more digital content items. An individual unit of work may include an individual digital content item by virtue of the individual digital content item (and/or a copy or instance thereof) being attached and/or appended thereto. A digital content item may include one or more of an image, a video, an audio file, a PDF, a word document, and/or other digital content items.


In some implementations, units of work created by, assigned to, and/or completed by the users may refer generally to a linking of the units of work with the individual users in the collaboration environment. A unit of work may be linked with a user in a manner that defines one or more relationships between the user and the unit of work. Such a relationship may connote and/or be a result of an action (past, present, and/or future) of the user with respect to the unit of work. Such actions may include one or more of creating a work unit record for a unit of work, being assigned to participate in a unit of work, participating in a unit of work, being granted access to a work unit record of a unit of work, adjusting a value of a work unit parameter of a work unit record of a unit of work, being assigned a role at the unit of work level, and/or other actions.


Individual sets of work unit records may be defined by a record hierarchy. A record hierarchy may convey individual positions of work unit records (and their corresponding units of work) in the record hierarchy. By way of non-limiting illustration, a position may specify one or more of a work unit record being superior to another work unit record, a work unit record being subordinate to another work unit record, and/or other information. As a result, individual work unit records in the individual sets of work unit records may be subordinate to other individual work unit records in the individual sets of work unit records. For example, a work unit record may define a unit of work comprising a task, and a subordinate work unit record may define a unit of work comprising a sub-task to the task. A record hierarchy may define a relationship between work unit records. A work unit record may have some restrictions placed on it by virtue of having a subordinate work unit record. By way of non-limiting illustration, a work unit record may be restricted from access (or restricted from marking complete) by one or more users unless and/or until a subordinate work unit record is completed and/or started.
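The completion restriction described above can be sketched with a toy hierarchy. The class and the rule below are illustrative assumptions, not the patented data model:

```python
from dataclasses import dataclass, field

@dataclass
class WorkUnit:
    name: str
    complete: bool = False
    subordinates: list = field(default_factory=list)  # e.g., sub-tasks of a task

def can_mark_complete(unit: WorkUnit) -> bool:
    # Restriction by virtue of hierarchy: all subordinate units must complete first.
    return all(sub.complete for sub in unit.subordinates)

task = WorkUnit("Ship release", subordinates=[WorkUnit("Write release notes")])
```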


Individual work unit records may include hierarchical information defining a record hierarchy of the individual work unit records. The hierarchical information of a work unit record may include one or more of information identifying other work unit records associated in a record hierarchy the work unit record belongs to, a specification of the position of the work unit record in the hierarchy, restrictions and/or other relationships placed on the work unit record by virtue of its position, and/or other information.


In some implementations, the one or more work unit parameters may include one or more of a work assignment parameter, work completion parameter, a work management parameter, work creation parameter, dependency parameter, and/or other parameters. The values of the work assignment parameter may describe assignees of individual units of work. The values of the work management parameter may describe users who manage individual units of work and/or the extent to which they manage. The values of the work creation parameter may describe creation characteristics of individual units of work. The creation characteristics may include who created the work unit record, when it was created, and/or other information.


In some implementations, values of a dependency parameter may describe whether a given unit of work is dependent on one or more other units of work. A unit of work being dependent on another unit of work may mean the unit of work may not be completed, started, assigned, and/or have other interactions performed in relation to the unit of work before some action is performed on the other unit of work. By way of non-limiting illustration, a unit of work may not be started until another unit of work is completed, meaning the unit of work may be dependent on the other unit of work. In some implementations, values of the dependency parameter may go hand in hand with the hierarchical information. By way of non-limiting illustration, a unit of work that is subordinate to another unit of work may be dependent on the other unit of work, or vice versa.
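A toy dependency check along these lines follows; the unit identifiers and the dependency map are hypothetical:

```python
# Completed units of work, and a map from each unit to the units it depends on.
completed = {"write-spec"}
dependencies = {
    "build-feature": ["write-spec"],
    "ship-release": ["build-feature"],
}

def can_start(unit_id: str) -> bool:
    # A unit of work may not start until every unit it depends on is completed.
    return all(dep in completed for dep in dependencies.get(unit_id, []))
```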


In some implementations, values of work unit parameters may include one or more of a unit of work name, a unit of work description, user role information, one or more unit of work dates (e.g., a start date, a due date or end date, a completion date, and/or dates), project inclusion (e.g., identification of projects supported by the individual units of work), one or more members associated with a unit of work (e.g., an owner, one or more collaborators, collaborator access information, and/or other unit of work collaborators and/or collaborator information), completion state, one or more user comment parameters (e.g., permission for who may make comments such as an assignee, an assignor, a recipient, one or more followers, and/or one or more other interested parties; content of the comments; one or more times; presence or absence of the functionality of up-votes; one or more hard-coded responses; and/or other parameters), one or more interaction parameters (e.g., indicating a given unit of work is being worked on/was worked on, a given unit of work was viewed, a given unit of work was selected, how long the given unit of work has been idle, a last interaction parameter indicating when and what user last interacted with the given unit of work, users that interacted with the given unit of work, quantity and/or content of comments on the unit of work, and/or other interaction parameters indicating sources of the interactions, context of the interactions, content of the interactions and/or time for the interactions), one or more digital content item attachments, notification settings, privacy, an associated URL, one or more interaction parameters (e.g., sources of the interactions, context of the interactions, content of the interactions, time for the interactions, and/or other interaction parameters), updates, state of a workspace for a given unit of work (e.g., application state parameters, application status, application interactions, user information, and/or other parameters related to the state of the workspace for a unit of work), one or more performance/productivity metrics for a given unit of work, hierarchical information, dependency, one or more custom fields (e.g., priority, cost, stage, and/or other custom fields), and/or other information.


The values of the work assignment parameter describing assignment of users to units of work may be determined based on one or more interactions by one or more users with a collaboration environment. In some implementations, one or more users may create and/or assign one or more units of work to themselves and/or another user. In some implementations, a user may be assigned a unit of work and the user may effectuate a reassignment of the unit of work from the user to one or more other users.


In some implementations, values of the work completion parameter may indicate that a completion status of a unit of work has changed from “incomplete” to “marked complete” and/or “complete”. In some implementations, a status of complete for a unit of work may be associated with the passing of an end date associated with the unit of work. In some implementations, a status of “marked complete” may be associated with a user providing input via the collaboration environment at the point in time the user completes the unit of work (which may be before or after an end date).
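The status transitions described above might be sketched as a small function; the status strings follow the paragraph above, while the function signature is an assumption:

```python
from datetime import date

def completion_status(marked_complete: bool, end_date, today: date) -> str:
    # "marked complete": explicit user input; "complete": the end date has passed.
    if marked_complete:
        return "marked complete"
    if end_date is not None and today > end_date:
        return "complete"
    return "incomplete"
```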


In some implementations, environment state component 108 may be configured to maintain queues of the units of work assigned to the users. The queues may be presented to the users in a user interface of the collaboration environment to facilitate access to the units of work via work unit pages. Individual queues may represent the units of work assigned to individual users organized in an order based on the individual end dates and/or other dates (e.g., start dates) and/or other ordering. Individual queues may be presented in a user interface based on one or more of a list view, a calendar view, and/or other views. The calendar view may be a calendar view by week, by more than one week (e.g., 1st through 15th), by month, by more than one month (e.g., May through July), and/or other calendar views. Units of work may be represented in a calendar view by user interface elements (e.g., icons, calendar entries, etc.).
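The end-date ordering of a queue might be sketched as follows; the unit names and dates are illustrative, and earliest-first is only one of the possible orderings mentioned above:

```python
from datetime import date

# Units of work assigned to one user (names and dates are illustrative).
assigned_units = [
    {"name": "Review budget", "end_date": date(2024, 7, 1)},
    {"name": "Draft plan", "end_date": date(2024, 6, 15)},
]

def queue_for_user(units):
    # One possible ordering: earliest end date first.
    return sorted(units, key=lambda u: u["end_date"])

queue = queue_for_user(assigned_units)
```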


Project information in project records may define values of project parameters for projects managed within the collaboration environment. The project parameters may characterize one or more projects created, assigned, and/or managed within the collaboration environment and/or via the collaboration environment, and/or the metadata associated with the one or more projects. Individual ones of the projects may be associated with individual ones of the project records. The project information may define values of the project parameters associated with a given project managed within the collaboration environment and/or via the collaboration environment. A given project may have one or more owners and/or one or more collaborators working on the given project. The given project may include one or more units of work assigned to one or more users under the given project heading. In some implementations, projects may include one or more units of work that may directly facilitate progress toward fulfillment of the projects. Accordingly, completion of units of work may directly contribute to progress toward fulfillment of the project. By way of non-limiting illustration, an individual project may be associated with a client and the units of work under the individual project heading may be work directly contributing to the fulfillment of a business relationship with the client.


The values of the project parameters may, by way of non-limiting example, include one or more of: one or more units of work within individual ones of the projects (which may include values of work unit parameters defined by one or more work unit records), status information, user role information, one or more user comment parameters (e.g., a creator, a recipient, one or more followers, one or more other interested parties, content, one or more times, upvotes, other hard-coded responses, etc.), a project name, a project description, one or more project dates (e.g., a start date, a due date, a completion date, and/or other project dates), one or more project collaborators (e.g., an owner, one or more other project collaborators, collaborator access information, and/or other project collaborators and/or collaborator information), one or more attachments, notification settings, privacy, an associated URL, one or more interaction parameters (e.g., sources of the interactions, context of the interactions, content of the interactions, time for the interactions, and/or other interaction parameters), updates, ordering of units of work within the given project, state of a workspace for a given task within the given project, and/or other information.


In some implementations, projects created by, assigned to, and/or completed by the users may refer generally to a linking of the projects with the individual users in the collaboration environment. A project may be linked with a user in a manner that defines one or more relationships between the user and the project. Such a relationship may connote and/or be a result of an action (past, present, and/or future) of the user with respect to the project. Such actions may include one or more of creating a project record for a project, being assigned to participate in a project, participating in a project, being granted access to a project record of a project, adjusting a value of a project parameter of a project record of a project, being assigned a project-level role, and/or other actions.


User role information may specify individual roles of the individual users. A role may represent a position of an individual user. The position may be specified based on a description of one or more of a job title, level, stage, and/or other descriptions of position. The role may be specified with respect to a business organization as a whole and/or other specifications. By way of non-limiting illustration, a role may include one or more of chief executive officer (or other officer), owner, manager, supervisor, accountant, associate, employee, intern, entry level, midlevel, senior, administrator, director, foreman, engineer, product developer, human resource officer, artist, art director, and/or other descriptions.


In some implementations, user role information may specify roles of the users within the units of work and/or the projects. The roles may convey expected contribution of the users in completing and/or supporting the units of work and/or the projects. The individual roles of individual users within the units of work may be specified separately from the individual roles of the individual users within the projects. The individual roles of individual users within the units of work and/or projects may be specified separately from the individual roles of the individual users within a business organization as a whole.


The objective information in objective records may include values of one or more objective parameters. The values of the objective parameters may be organized in objective records corresponding to business objectives managed, created, and/or owned within the collaboration environment. A given business objective may have one or more collaborators and/or team members working on the given business objective. Business objectives may include one or more associated units of work and/or projects that one or more users should accomplish and/or plan on accomplishing. Business objectives may be created by a given user for the given user and/or created by the given user and assigned to be owned by one or more other users. Individual business objectives may include one or more of an individual goal, an individual sub-goal, and/or other business objectives assigned to be owned by a user and/or associated with one or more users.


The business objectives may be associated with a set of units of work and/or projects that may indirectly facilitate progress toward fulfillment of the business objectives. The set of units of work and/or projects may not directly contribute to the progress. By way of non-limiting illustration, a connection between the set of units of work and/or projects and a corresponding business objective may be indirect in that completion of at least one of the units of work and/or projects may have no direct impact on progress toward fulfillment of the business objective. The concept of “no direct impact” may mean that completion of the at least one unit of work and/or project may not cause progress toward fulfillment of the business objective without independent action outside of the at least one unit of work and/or project. Instead, completion of the at least one unit of work and/or project may make such independent action more likely (e.g., through coercion, assistance, education, incentivization, reminder, etc.). However, in some implementations, business objectives may be associated with a set of units of work and/or projects that may directly facilitate progress toward fulfillment of the business objectives. Accordingly, completion of the set of units of work and/or projects may directly contribute to the progress toward fulfillment. Business objectives may be associated with an objectives and key results (OKR) goal-setting framework. Business objectives may be specified on one or more of a team basis, organization basis, and/or other specifications. In some implementations, business objectives may be characterized as user objectives. The user objectives may be associated with a set of units of work and/or projects that may indirectly (and/or directly) facilitate progress toward fulfillment of the user objectives. User objectives may be specified on an individual user basis.


Individual objective records may describe individual business objectives and/or identify sets of work unit records and/or project records that support the individual business objectives.


Individual sets of objective records may be defined by an objective record hierarchy. An objective record hierarchy may convey individual positions of objective records (and their corresponding business objectives) in the objective record hierarchy. By way of non-limiting illustration, a position may specify one or more of an objective record being superior to one or more other objective records, an objective record being subordinate to one or more other objective records, and/or other information. As a result, individual objective records may be subordinate and/or superior to other individual objective records. For example, the objective records may further include a second objective record. The first objective record and the second objective record may be organized by a first objective record hierarchy specifying that the second objective record is subordinate to the first objective record.
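
By way of non-limiting illustration, the objective record hierarchy described above may be sketched in Python as follows. The class and field names are illustrative assumptions for explanatory purposes only and are not part of the disclosure:

```python
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class ObjectiveRecord:
    """Illustrative sketch of an objective record with hierarchical information."""
    name: str
    superior: Optional["ObjectiveRecord"] = None
    subordinates: List["ObjectiveRecord"] = field(default_factory=list)

    def add_subordinate(self, record: "ObjectiveRecord") -> None:
        # Position the record as subordinate to this record in the hierarchy.
        record.superior = self
        self.subordinates.append(record)


# A first objective record with a second objective record subordinate to it.
first = ObjectiveRecord(name="Grow annual revenue")
second = ObjectiveRecord(name="Launch new product line")
first.add_subordinate(second)
```

In this sketch, the hierarchical information of each record is carried by the `superior` and `subordinates` fields, so that a record's position in the hierarchy can be recovered by traversal.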


An objective record may define a business objective comprising a progress towards fulfillment, and a subordinate objective record may define a subordinate business objective comprising a progress towards fulfillment of the subordinate business objective. An objective record hierarchy may define a relationship between objective records.


Individual objective records may include hierarchical information defining an objective record hierarchy of the individual objective records. The hierarchical information of an objective record may include one or more of information identifying other objective records associated in an objective record hierarchy the objective record belongs to, a specification of the position of the objective record in the hierarchy, other relationships placed on the objective record by virtue of its position, and/or other information.


In some implementations, as a consequence of the objective record hierarchies, the individual business objectives described in the individual objective records that are subordinate to the other individual objective records may be subordinate to the individual business objectives in the other individual objective records.


In some implementations, the one or more objective parameters may include one or more of an objective definition parameter, an objective owner parameter, an objective management parameter, an objective creation parameter, an objective progress parameter, and/or other parameters. The value of the objective definition parameter may describe the particular business objective. The values of the objective owner parameter may describe business objectives assigned to be owned by an individual user. The values of the objective management parameter may describe business objectives managed as collaborators by the individual users. The values of the objective creation parameter may describe business objectives created by the individual users.


In some implementations, the business objectives may be described based on one or more of a business objective name, a business objective description, one or more business objective dates (e.g., a start date, a due date, and/or other dates), one or more members associated with a business objective (e.g., an owner, one or more other project/task members, member access information, and/or other business objective members and/or member information), progress information (e.g., an update, a hardcoded status update, a measured status, a progress indicator, quantity value remaining for a given business objective, completed work units in a given project, and/or other progress information), one or more interaction parameters, notification settings, privacy, an associated URL, one or more custom fields (e.g., priority, cost, stage, and/or other custom fields), and/or other information.


The values of the objective owner parameter describing business objectives owned by the individual users may be determined based on one or more interactions by one or more users with a collaboration environment. In some implementations, one or more users may create and/or assign ownership of one or more business objectives to themselves and/or another user. In some implementations, a user may be assigned to own a business objective and the user may effectuate a reassignment of ownership of the business objective from the user to one or more other users.


Progress information for the individual business objectives may convey progress toward fulfillment of the individual business objectives. In some implementations, the progress toward fulfillment of the business objectives may be specified as one or more of a quantitative value, a qualitative value, and/or other information. In some implementations, the quantitative value may be a percentage of completion, an integer value, a dollar amount, and/or other values. In some implementations, progress toward fulfillment of the individual business objectives may be determined independently from incremental completion of the units of work in the individual sets of units of work associated with the individual business objectives. The completion of the units of work associated with a given business objective may not directly progress the given business objective toward fulfillment, but completing the units of work may make accomplishing the business objective more likely (e.g., through coercion, assistance, education, incentivization, reminder, etc.). However, in some implementations, progress toward fulfillment of the individual business objectives may be directly determined based on incremental completion of the units of work in the individual sets of units of work associated with the individual business objectives.
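
By way of non-limiting illustration, the implementation option in which progress is directly determined from incremental completion of associated units of work may be sketched as follows. The quantitative value here is a percentage of completion; the data shapes are illustrative assumptions only:

```python
def progress_toward_fulfillment(units_of_work):
    """Quantitative progress as a percentage of completed associated units of work.

    Illustrative sketch of the option where completion of the associated units
    of work directly contributes to progress toward fulfillment.
    """
    if not units_of_work:
        return 0.0
    completed = sum(1 for unit in units_of_work if unit["complete"])
    return 100.0 * completed / len(units_of_work)


units = [
    {"name": "Draft spec", "complete": True},
    {"name": "Review spec", "complete": True},
    {"name": "Publish spec", "complete": False},
]
```

With two of three units complete, the sketch reports roughly 66.67 percent progress toward fulfillment.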


It is noted that metadata and/or values of parameters related to any users, projects, business objectives, and/or units of work may be considered values of user parameters, project parameters, objective parameters, and/or work unit parameters.


In some implementations, environment state component 108 may be configured to manage information defining work unit pages corresponding to the individual units of work. Individual work unit pages may provide access to individual units of work. Managing information defining work unit pages may include determining, obtaining, and/or modifying information used to generate work unit pages. Managing information defining individual work unit pages may include providing information to the user interface component 110 to effectuate presentation of the work unit pages, and/or other information. In some implementations, individual work unit pages may include individual sets of interface elements displaying the values of one or more of the work unit parameters of the individual units of work.


In some implementations, environment state component 108 may be configured to manage information defining project pages corresponding to the individual projects. Individual project pages may provide access to individual projects. Managing information defining project pages may include determining, obtaining, and/or modifying information used to generate project pages. Managing information defining individual project pages may include providing information to the user interface component 110 to effectuate presentation of the project pages, and/or other information. In some implementations, individual project pages may include individual sets of interface elements displaying the values of one or more of the project parameters of the individual projects.


In some implementations, environment state component 108 may be configured to manage information defining business objective pages corresponding to the individual business objectives. Individual business objective pages may provide access to individual business objectives. Managing information defining business objective pages may include determining, obtaining, and/or modifying information used to generate business objective pages. Managing information defining individual business objective pages may include providing information to the user interface component 110 to effectuate presentation of the business objective pages, and/or other information. In some implementations, individual business objective pages may include individual sets of interface elements displaying the values of one or more of the objective parameters of the individual business objectives.


The user interface component 110 may be configured to effectuate presentation of instances of a user interface of the collaboration environment. The user interface may provide one or more views of the collaboration environment and/or provide other features and/or functionality. The one or more views may include one or more pages of the collaboration environment. In some implementations, an individual view of the collaboration environment may textually and/or graphically display information from one or more of a user record, a project record, an objective record, and/or other records. By way of non-limiting illustration, a view may display one or more of a work unit page, a project page, a business objective page, a queue of units of work, and/or other information.


The user interface component 110 may be configured to effectuate presentation of a user interface through which users input and/or upload digital assets representing sets of content. The individual sets of content may include one or more of recorded video content, recorded audio content, and/or other content.


In some implementations, digital assets that users may input may include one or more of video files, audio files, and/or other digital assets representing sets of content.


In some implementations, content of audio files may include recorded audio content and/or other content. The recorded audio content may include utterances of users and/or other sounds. User utterances may include speech, emotes, and/or other vocalizations. An individual utterance may include one or more of a spoken word, a statement, a vocal sound, and/or other considerations. Other sounds may include environmental noise and/or other sounds.


In some implementations, content of video files may include one or more of recorded video content, recorded audio content, and/or other content. In some implementations, the recorded video content may include visual content. The visual content may include one or more of individual images comprising frames of a video, sets of images comprising a series of frames, and/or other content. The video content may depict one or more of real-world users, real-world environment, digital content, and/or other content. In some implementations, video files may include time stamps associated with the recorded visual content and recorded audio content. The time stamps may provide a synchronization between the recorded video content and recorded audio content.


In some implementations, the sets of content represented in the digital assets may generally be unstructured content. “Unstructured” may mean that the content represented in the digital assets may be unusable by the collaboration environment unless and/or until it is structured in one form or another. The user interface may include one or more portions. A portion may include an input portion configured to receive user input of the digital assets. In some implementations, the input portion may be configured to receive user input through drag-and-drop input, file selection through a search and/or drop-down menu, and/or other input.


The user interface component 110 may obtain input information conveying user input into a user interface. In some implementations, the input information may define individual ones of the digital assets input by the users via the user interface. By way of non-limiting illustration, the input information may include the data included in uploaded files. By way of non-limiting illustration, the input information may define a first digital asset input into the user interface via the input portion by a first user and/or other users. The first digital asset may represent first recorded audio content and/or other content. The first recorded audio content may include a first set of utterances. The first set of utterances may include utterances by one or more users.


In some implementations, the content component 112 may be configured to, in response to obtaining the input information, generate transcription information and/or other information. The transcription information may characterize the audio and/or video content. The transcription information may include transcripts and/or other information. The transcripts may include text strings characterizing one or more of the utterances by users, physical gestures by users, facial expressions by users, physical movement by users, and/or other content. The transcripts may include text strings comprising one or more of the utterances by users, descriptions of the physical gestures by users, descriptions of the facial expressions by users, descriptions of the physical movement by users, and/or other content. By way of non-limiting illustration, first transcription information may be generated from the first recorded audio content and/or other content of the first digital asset. The first transcription information may include a first transcript comprising a first set of text strings characterizing the first set of utterances and/or other content.


In some implementations, generating transcription information from recorded audio may include performing speech processing on the recorded audio content and/or other processing. In some implementations, the speech processing may identify users, characterize user utterances in the audio content, and/or produce other output. By way of non-limiting illustration, the speech processing may determine user utterances (e.g., words and/or phrases spoken), identify users who are speaking, and/or provide other information. In some implementations, generating transcription information from recorded audio may include performing semantic natural language processing to determine the meaning of the utterances. In some implementations, the meaning of the utterances may be included in the transcripts. By way of non-limiting illustration, the meaning may be included with one or more of in-line descriptions, footnotes, as metadata accompanying the transcripts, and/or other considerations.
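
By way of non-limiting illustration, assembly of a transcript from speech-processing output may be sketched as follows. The `recognize_speech` function is a stub standing in for a real speech-processing engine, and the segment shape is an illustrative assumption:

```python
def recognize_speech(audio_segment):
    """Stub standing in for a real speech-processing engine.

    A production system would perform actual recognition here; this stub
    simply returns pre-labeled speaker and utterance text.
    """
    return audio_segment["speaker"], audio_segment["words"]


def build_transcript(audio_segments):
    # Assemble text strings characterizing the utterances, with the
    # identified speaker noted in-line for each utterance.
    lines = []
    for segment in audio_segments:
        speaker, words = recognize_speech(segment)
        lines.append(f"{speaker}: {words}")
    return "\n".join(lines)


segments = [
    {"speaker": "User A", "words": "We need a task for the launch."},
    {"speaker": "User B", "words": "Assign it to me."},
]
```

In this sketch the speaker identification is included in-line; the meaning of the utterances could equally be attached as footnotes or metadata accompanying the transcript, per the options above.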


In some implementations, generating transcription information from recorded video content may include performing one or more of feature detection on the visual content to characterize features in the visual content, speech processing on audio content to characterize user utterances in the audio content, and/or other processing techniques. In some implementations, feature detection may characterize features in the visual content. By way of non-limiting illustration, techniques for feature detection may include one or more of computer vision, Speeded Up Robust Features (SURF), Scale-Invariant Feature Transform (SIFT), Oriented FAST and rotated BRIEF (ORB), Optical Character Recognition (OCR), facial recognition, and/or other techniques. In some implementations, the features and/or characterizations thereof may be included in the transcripts. By way of non-limiting illustration, descriptions of the features and/or characterizations thereof may be included with one or more of in-line descriptions, footnotes, as metadata accompanying the transcripts, and/or other considerations.


Content component 112 may be configured to generate transcription information in response to specific user input requesting the generation of the transcription information following the input of the digital assets. By way of non-limiting illustration, responsive to obtaining a request to generate transcription following the input of the first digital asset, content component 112 may be configured to generate the first transcription information.


In some implementations, the transcription information may include context information specifying context of the digital assets. The context may include one or more of an uploader of a digital asset (e.g., user who input the digital asset), a creator of a digital asset (e.g., a user who recorded the audio and/or video content), a page of the collaboration environment from which the user accessed the user interface to input of the digital assets, a time and/or date associated with the input of a digital asset, and/or other information.


The content component 112 may be configured to determine context information. By way of non-limiting illustration, the content component 112 may be configured to identify one or more of the users linked to a digital asset. The one or more users linked to the digital asset may include one or more of the creator of the digital asset, the uploader of the digital asset, and/or other users. The content component 112 may be configured to identify a time and/or date a digital asset was uploaded.


The content component 112 may be configured to obtain the user records for the users linked to the digital asset. In some implementations, an uploader of a digital asset may be identified. The user record of the uploader may be accessed and/or obtained to gather further information about the uploader. In some implementations, a creator of a digital asset may be identified. The creator may be identified based on identifying information included in the content of the digital asset and/or metadata associated with the digital asset. The user record of the creator may be accessed and/or obtained to gather further information about the creator.
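
By way of non-limiting illustration, the determination of context information for a digital asset may be sketched as follows. The asset and user-record shapes are illustrative assumptions, not the actual data model of system 100:

```python
def determine_context(digital_asset, user_records):
    """Gather context information for an uploaded digital asset.

    Illustrative sketch: looks up the uploader and creator in the user
    records and captures the upload time and source page.
    """
    uploader = user_records.get(digital_asset["uploader_id"])
    # The creator may be identified from metadata associated with the asset.
    creator = user_records.get(digital_asset.get("creator_id"))
    return {
        "uploader": uploader,
        "creator": creator,
        "uploaded_at": digital_asset["uploaded_at"],
        "source_page": digital_asset.get("source_page"),
    }


user_records = {
    "u1": {"name": "Avery", "role": "manager"},
    "u2": {"name": "Blake", "role": "associate"},
}
asset = {
    "uploader_id": "u1",
    "creator_id": "u2",
    "uploaded_at": "2022-02-17T10:00:00",
    "source_page": "project/42",
}
context = determine_context(asset, user_records)
```

The returned context information could then be included with the transcription information, as described above.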


In some implementations, a user may access the user interface to input a digital asset by navigating to the user interface from a page of the collaboration environment. The page may be associated with a record, referred to as a “source record” for the page. The source record may include one or more of a user record, a work unit record, a project record, an objective record, and/or other records.


The work creation component 114 may be configured to generate individual records based on transcription information and/or other information. In some implementations, an individual record may be generated by determining information describing the individual record as part of the environment state information. Determining the information describing the individual record may include specifying values for one or more parameters.


By way of non-limiting illustration, an individual work unit record may be generated by determining work information describing the individual work unit record as part of the environment state information. Determining work information may include specifying values for one or more work unit parameters. By way of non-limiting illustration, an individual project record may be generated by determining project information describing the individual project record as part of the environment state information. Determining project information may include specifying values for one or more project parameters. By way of non-limiting illustration, an individual objective record may be generated by determining objective information describing the individual objective record as part of the environment state information. Determining objective information may include specifying values for one or more objective parameters. It is noted that while one or more implementations described herein may refer to generation of work unit records, this is for illustrative purposes only and not to be considered limiting. Instead, those skilled in the art will appreciate that other types of records may be generated, including but not limited to project records, user records, objective records and/or other records.


A generated work unit record may include values determined based on context information of the transcription information, and/or other information. By way of non-limiting illustration, a work unit record may include one or more values determined based on an identity of the uploader. By way of non-limiting illustration, the uploader may be specified as one or more of an assignee of the work unit record, an assignor of the work unit record, a collaborator of the work unit record, a reviewer of the work unit record, and/or used to specify other values. In some implementations, a user linked to, and/or associated with, the uploader may be used to specify one or more values. By way of non-limiting illustration, a user having a subordinate role with respect to the uploader may be an assignee of the work unit record, a user having a superior role with respect to the uploader may be a reviewer of the work unit record, and/or other values may be specified.


In some implementations, a generated work unit record may include values determined based on an identity of a creator. By way of non-limiting illustration, the creator may be specified as one or more of an assignee of the work unit record, an assignor of the work unit record, a collaborator of the work unit record, a reviewer of the work unit record, and/or used to specify other values. In some implementations, a user linked to, and/or associated with, the creator may be used to specify one or more values. By way of non-limiting illustration, a user having a subordinate role with respect to the creator may be an assignee of the work unit record, a user having a superior role with respect to the creator may be a reviewer of the work unit record, and/or other values may be specified.
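
By way of non-limiting illustration, the role-based specification of values described above may be sketched as follows. The reporting-relationship rule (subordinates of the creator become assignees, superiors become reviewers) is one illustrative option, and the names and shapes are hypothetical:

```python
def assign_roles_from_creator(creator, users):
    """Specify work unit record values from the creator's organizational role.

    Illustrative rule: users subordinate to the creator become assignees and
    users superior to the creator become reviewers.
    """
    values = {"assignor": creator["name"], "assignees": [], "reviewers": []}
    for user in users:
        if user.get("reports_to") == creator["name"]:
            # Subordinate role with respect to the creator -> assignee.
            values["assignees"].append(user["name"])
        elif creator.get("reports_to") == user["name"]:
            # Superior role with respect to the creator -> reviewer.
            values["reviewers"].append(user["name"])
    return values


creator = {"name": "Blake", "reports_to": "Avery"}
users = [{"name": "Avery"}, {"name": "Casey", "reports_to": "Blake"}]
```

The same rule could be applied with respect to the uploader rather than the creator, per the preceding paragraph.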


In some implementations, a generated work unit record may be included in a source record identified by the context information. In some implementations, a generated work unit record may be specified as being subordinate to the source record. In some implementations, a generated work unit record may be specified as being superior to the source record. In some implementations, a generated work unit record may be specified as being dependent on, or depending from, the source record. In some implementations, a generated work unit record may have one or more dates (e.g., due date, review date, and/or other dates) that are specified relative to one or more dates of the source record. By way of non-limiting illustration, the generated work unit record may have a due date that is specified as occurring a specified number of days before and/or after a due date of the source record.


In some implementations, a generated work unit record may include values determined based on a time and/or date a digital asset was uploaded. By way of non-limiting illustration, a due date of the work unit record may be determined relative to the time and/or date of the upload. By way of non-limiting illustration, a due date may be specified as being a given period of time following the time and/or date of upload.
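
By way of non-limiting illustration, relative date determination may be sketched as follows. The seven-day and three-day offsets are illustrative assumptions, not values prescribed by the disclosure:

```python
from datetime import datetime, timedelta


def due_date_from_upload(upload_time, days_after=7):
    """Due date specified as a given period of time following the upload."""
    return upload_time + timedelta(days=days_after)


def due_date_from_source(source_due, days_before=3):
    """Due date specified a number of days before a due date of the source record."""
    return source_due - timedelta(days=days_before)


uploaded = datetime(2022, 2, 17, 10, 0)
```

Either rule produces a concrete value for the due-date work unit parameter of the generated record.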


In some implementations, work creation component 114 may be configured to train and/or utilize a machine learning model to generate individual records based on transcription information and/or other information. In some implementations, the machine learning model may utilize one or more of an artificial neural network, naïve Bayes classifier algorithm, k-means clustering algorithm, support vector machine algorithm, linear regression, logistic regression, decision trees, random forest, nearest neighbors, and/or other approaches. The work creation component 114 may utilize training techniques such as deep learning, and/or one or more of supervised learning, semi-supervised learning, unsupervised learning, reinforcement learning, and/or other techniques.


In supervised learning, the model may be provided with a known training dataset that includes desired inputs and outputs, and the model may be configured to determine how to arrive at those outputs based on the inputs. The model may identify patterns in data, learn from observations, and make predictions. The model may make predictions and may be corrected or validated by an operator; this process may continue until the model achieves a high level of accuracy/performance. Supervised learning may utilize approaches including one or more of classification, regression, and/or forecasting.
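
By way of non-limiting illustration, supervised classification of transcript text into record types may be sketched with a minimal naïve Bayes classifier (one of the approaches listed above). This is a stand-in sketch, not the patent's actual model; the training pairs and labels are hypothetical:

```python
import math
from collections import Counter, defaultdict


class NaiveBayesTextClassifier:
    """Minimal supervised naïve Bayes text classifier (illustrative sketch)."""

    def __init__(self):
        self.word_counts = defaultdict(Counter)
        self.label_counts = Counter()
        self.vocab = set()

    def train(self, pairs):
        # pairs: (input text, desired output label) -- the known training dataset.
        for text, label in pairs:
            words = text.lower().split()
            self.label_counts[label] += 1
            self.word_counts[label].update(words)
            self.vocab.update(words)

    def predict(self, text):
        words = text.lower().split()
        total = sum(self.label_counts.values())
        best_label, best_score = None, float("-inf")
        for label, count in self.label_counts.items():
            # Log prior plus log likelihood with add-one smoothing.
            score = math.log(count / total)
            denom = sum(self.word_counts[label].values()) + len(self.vocab)
            for word in words:
                score += math.log((self.word_counts[label][word] + 1) / denom)
            if score > best_score:
                best_label, best_score = label, score
        return best_label


pairs = [
    ("fix login bug by friday", "task"),
    ("ship the mobile app redesign", "project"),
    ("update unit test for login", "task"),
    ("plan the quarterly marketing project", "project"),
]
model = NaiveBayesTextClassifier()
model.train(pairs)
```

After training, the model predicts a record type for new transcript text based on the word patterns it observed in the labeled examples.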


Semi-supervised learning may be similar to supervised learning, but instead uses both labeled and unlabeled data. Labeled data may comprise information that has meaningful tags so that the model can understand the data, while unlabeled data may lack that information. By using this combination, the machine learning model may learn to label unlabeled data.


For unsupervised learning, the machine learning model may study data to identify patterns. There may be no answer key or human operator to provide instruction. Instead, the model may determine the correlations and relationships by analyzing available data (see, e.g., text corpus described below). In an unsupervised learning process, the machine learning model may be left to interpret large data sets and address that data accordingly. The model may try to organize that data in some way to describe its structure. This might mean grouping the data into clusters or arranging it in a way that looks more organized. Unsupervised learning may use techniques such as clustering and/or dimension reduction.


Reinforcement learning may focus on regimented learning processes, where the machine learning model may be provided with a set of actions, parameters, and/or end values. By defining the rules, the machine learning model may then explore different options and possibilities, monitoring and evaluating each result to determine which one is optimal. Reinforcement learning teaches the model through trial and error. The model may learn from past experiences and begin to adapt its approach in response to the situation to achieve the best possible result.


By way of non-limiting illustration, work creation component 114 may be configured to train a machine learning model on input/output pairs to generate a trained machine learning model configured to generate individual records.


The input/output pairs may include training input information and training output information. The training input information and/or training output information may be derived from a text corpus. The text corpus may comprise the text making up the pages (e.g., work unit pages, project pages, etc.) for existing records of the collaboration environment. The training output information may include the values of the parameters of the existing records and/or the organization thereof as stored by the system 100. By way of non-limiting illustration, training input information for an individual input/output pair may include the text making up an individual work unit page for an individual existing one of the work unit records; and the training output information for the individual input/output pair may include the values of the work unit parameters of the individual existing one of the work unit records. By way of non-limiting illustration, training input information for an individual input/output pair may include the text making up an individual project page for an individual existing one of the project records; and the training output information for the individual input/output pair may include the values of the project parameters of the individual existing one of the project records.
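
By way of non-limiting illustration, compiling input/output pairs from page text and stored parameter values may be sketched as follows. The record and page shapes are illustrative assumptions about how system 100 stores existing records:

```python
def compile_training_pairs(existing_records, pages):
    """Compile input/output pairs from page text and stored parameter values.

    Training input: the text making up the page for an existing record.
    Training output: the values of that record's parameters as stored.
    """
    training_pairs = []
    for record_id, record in existing_records.items():
        page_text = pages.get(record_id)
        if page_text is None:
            # Skip records without a corresponding page in the text corpus.
            continue
        training_pairs.append({"input": page_text, "output": record["parameters"]})
    return training_pairs


existing_records = {
    "w1": {
        "parameters": {
            "name": "Fix login bug",
            "assignee": "Blake",
            "due_date": "2022-03-01",
        }
    }
}
pages = {"w1": "Fix login bug\nAssignee: Blake\nDue: March 1, 2022"}
training_pairs = compile_training_pairs(existing_records, pages)
```

Each resulting pair associates the free-form page text with the structured values the system ultimately stored, which is the relationship the trained model learns.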


The work creation component 114 may be configured to obtain and/or derive the text corpus comprising the text making up the pages (e.g., work unit pages, project pages, etc.) for existing records of the collaboration environment. The work creation component 114 may be configured to obtain and/or access the existing records. The work creation component 114 may be configured to compile the text from the text corpus and the values from the existing records into the input/output pairs. The work creation component 114 may be configured to train the machine learning model based on multiple sets of input/output pairs to generate the trained machine learning model. The training process may continue until the model achieves a desired level of accuracy on the training data. The work creation component 114 may be configured to store the trained machine learning model. By using the existing records and text of pages, the trained machine learning model may be trained in a way that already reflects how users create records. The text corpus provides insight on how users actually define elements of a record (e.g., by the words they use). By using this text in relation to how the records are ultimately defined and stored by the system 100, the trained machine learning model may be well adapted to generate records from the transcripts.


In some implementations, the text corpus may comprise text derived from a particular set of one or more parameters of the existing records. By way of non-limiting illustration, the text corpus may be focused on the “name” (or “title”) of the record as it appears on a page. By way of non-limiting illustration, the text corpus may be focused on the “description” of the record as it appears on a page. The text corpus may be focused on other particular parameters.


In some implementations, the text corpus may comprise text making up the pages for particular types of records of the collaboration environment. The text corpus may comprise text making up the pages for work unit records so that the trained machine learning model may be particularly adapted to generating work unit records. The text corpus may comprise text making up the pages for project records so that the trained machine learning model may be particularly adapted to generating project records. The text corpus may comprise text making up the pages for objective records so that the trained machine learning model may be particularly adapted to generating objective records. The text corpus may comprise text making up the pages for user records so that the trained machine learning model may be particularly adapted to generating user records.


In some implementations, a text corpus may be derived from existing records of some or all users of the system 100 and used to train a machine learning model that applies to some or all users. In some implementations, a text corpus may be derived from a set of existing records specific to a set of users and used to train a machine learning model that is specific to the set of users. By way of non-limiting illustration, a set of users may be associated with a given business entity such that the business entity may have an account within system 100 which provides access to the set of users (e.g., its employees). The text corpus may be derived from the set of existing records specific to the set of users such that the resulting trained machine learning model may be particularly adapted to the style, grammar, preferences, lexicography, and/or other attributes particular to the set of users. By adapting the machine learning model to a specific set of users, the output from the trained machine learning model may be more accurate and/or relevant for that set of users.
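Selecting a tenant-specific corpus, as described above, can be sketched as a simple filter over existing records. The `entity_id` and `page_text` field names are hypothetical, chosen only for illustration.

```python
def corpus_for_entity(existing_records, entity_id):
    """Select the page text of only those records belonging to one business entity,
    so the resulting model adapts to that entity's style and vocabulary."""
    return [r["page_text"] for r in existing_records if r["entity_id"] == entity_id]

# Hypothetical records from two different business entities.
records = [
    {"entity_id": "acme", "page_text": "Ship v2 release by end of week"},
    {"entity_id": "globex", "page_text": "File the quarterly TPS report"},
]
acme_corpus = corpus_for_entity(records, "acme")
```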


By way of non-limiting illustration, the machine learning model may be further trained to identify triggers from the text corpus. Triggers may refer to words, phrases, and/or other content that may trigger generation of one or more records. Triggers may refer to words, phrases, and/or other content that may trigger specification of one or more values of one or more parameters of the records. In some implementations, triggers may be user-specific, team-specific, and/or system-wide. In some implementations, one or more words and/or phrases may be identified from the text corpus as trigger words and/or phrases. Trigger phrases and/or words may include words and/or phrases conveying one or more of action items or tasks to be completed, intent, desire, wants, and/or other information. In some implementations, the trigger phrases and/or words may include words accompanied by one or more of will you, can you, could you, please, by a given date, before the given date, we need, I need, I want, would be grateful if, you could, and/or other phrases and/or words.
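Trigger identification as described above can be sketched as a substring scan over a transcript. The phrase list below is a small illustrative subset of the trigger words and phrases named in the paragraph, not an exhaustive or learned set; in practice the triggers may be identified from the text corpus by the model itself.

```python
# Illustrative subset of trigger phrases; a trained model may learn these
# (and user- or team-specific ones) from the text corpus instead.
TRIGGER_PHRASES = ["can you", "could you", "please", "we need", "i need", "i want"]

def find_triggers(transcript_text):
    """Return the trigger phrases present in a transcript (case-insensitive)."""
    text = transcript_text.lower()
    return [phrase for phrase in TRIGGER_PHRASES if phrase in text]

triggers = find_triggers("Can you finish the deck? We need it before Friday.")
```

Detected triggers could then prompt generation of a record, or specification of a value such as a due date ("before Friday").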


The work creation component 114 may be configured to provide the transcription information and/or other information as input into the trained machine-learning model. The trained machine-learning model may be configured to provide output including new ones of the records. By way of non-limiting illustration, the input into the trained machine-learning model may include the first transcription information and/or other information.


The work creation component 114 may be configured to obtain the output from the trained machine-learning model. The output, including new ones of the records, may include specification of one or more values of one or more parameters. In some implementations, individual values may include individual text and/or text strings from the transcripts. In some implementations, individual values may include individual meanings of the utterances. In some implementations, individual values may include individual features and/or characterizations thereof from the video.


By way of non-limiting illustration, the output defining a new work unit record may include one or more values of one or more work unit parameters. By way of non-limiting illustration, the output may include a first work unit record based on the input of the first transcription information and/or other information. The output may include the first work unit record by virtue of including a first set of values for the work unit parameters for the first work unit record. In some implementations, the first set of values may include one or more of a first value for a title parameter, a second value of a work assignment parameter, a third value for a due date parameter, and/or other values. In some implementations, individual ones of the values in the first set of values may include individual text and/or text strings in the first set of text strings. In some implementations, individual ones of the values in the first set of values may include individual meanings of the utterances.
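Assembling a new work unit record from a set of model-output parameter values can be sketched as below. The parameter names (`title`, `assignee`, `due_date`) and the `defaults` mechanism are illustrative assumptions.

```python
def build_work_unit_record(output_values, defaults=None):
    """Assemble a new work unit record from model-output parameter values,
    filling any parameters the model did not specify from defaults."""
    record = dict(defaults or {})
    record.update(output_values)  # model-specified values take precedence
    return record

# Hypothetical set of values emitted by the trained model for one record.
model_output = {"title": "Prepare launch checklist",
                "assignee": "Smithy",
                "due_date": "2022-03-01"}
new_record = build_work_unit_record(model_output, defaults={"status": "open"})
```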


In some implementations, the environment state component 108 may be configured to, in response to the generating new records, generate and/or store resource information in the new records. The resource information may include one or more of copies of the digital assets, resource identifiers that facilitate access to the digital assets, and/or other information. A resource identifier may include one or more of a pointer, a hyperlink, and/or other identifier configured to provide access to an individual record (e.g., a source record and/or other record). In some implementations, resource information may include timestamp information that identifies one or more points in time within recorded content from which records were generated. For example, transcripts may have content (e.g., words and/or phrases) timestamped in accordance with when they appear in recorded content. When the content results in a record being generated, the timestamp of that content may be recorded and included in the resource information.


Storing the resource information in the individual records may cause individual resource identifiers and/or timestamp information to be presented in individual pages of the individual records. By presenting timestamp information, users will know where the record originated from, and can quickly refer back to it for reference if needed. For example, a page for a record may include a link to recorded content and a notification of a point in time within the recorded content that drove the generation of the record (e.g., “Minute 34 of the Audio recording X”).
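Attaching resource information, including a timestamp pointing back into the recorded content, can be sketched as follows. The field names and the "Minute N of recording X" label format are assumptions modeled on the example above.

```python
def attach_resource_info(record, asset_id, timestamp_seconds):
    """Store a resource identifier for the source recording plus the point in
    time within it that drove generation of this record."""
    minute = timestamp_seconds // 60
    record["resource"] = {
        "asset_id": asset_id,
        "timestamp_seconds": timestamp_seconds,
        # Human-readable label a page could display next to the link.
        "label": f"Minute {minute} of recording {asset_id}",
    }
    return record

record = attach_resource_info({"title": "Follow up with vendor"}, "X", 2040)
```

A page rendering this record could present the label as a link that seeks the recording to the stored offset.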


In some implementations, environment state information may be updated as users continue to interact with the collaboration environment via the user interfaces over time. The environment state component 108 may store and/or archive the environment state information periodically and/or based on user request to archive. In some implementations, the environment state component 108 may store historical environment state information specifying historical user information, historical work information, historical project information, historical objective information, user interaction history, and/or other information.


In some implementations, the trained machine learning model may be updated/refined based on user corrections and/or validations to the generated records. For example, a user accessing a page for a record generated based on recorded content may be prompted to correct and/or validate information appearing on the page. For example, the user may refine/change a title, description, and/or other information. The corrections and/or validations may cause the model to adapt and/or learn so that it may achieve a high level of accuracy/performance. In some implementations, a user may be guided through a process walking the user from the recorded content to the page for the generated record. Prompts or notifications may be presented during the walkthrough which represent decision(s) made by the machine learning model, where the user can then correct and/or validate the decision(s). This may provide a user-friendly way to validate what was generated and may be an expansion of the training of the model itself.
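Turning a user's correction or validation into material for refining the model can be sketched as below. The example treats the user-approved values as the new training target; the field names are illustrative.

```python
def collect_refinement_example(generated, corrected, source_text):
    """Turn a user's correction (or unchanged validation) of a generated record
    into a new training example for refining the model."""
    validated_unchanged = generated == corrected
    return {
        "input": source_text,
        "output": corrected,  # the user-approved values become the target
        "validated_unchanged": validated_unchanged,
    }

example = collect_refinement_example(
    generated={"title": "prep deck"},
    corrected={"title": "Prepare the Q3 deck"},
    source_text="can you prep the deck for Q3",
)
```

Examples where `validated_unchanged` is true confirm the model's decision; corrected examples steer subsequent training toward the user's preferred phrasing.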



FIG. 3 illustrates a user interface 300 of a collaboration environment, in accordance with one or more implementations. The user interface 300 may include a view of a collaboration environment. In particular, the user interface 300 may comprise a work unit page 302 for a unit of work 301 from which text may be derived as part of a text corpus for training a machine learning model. The user interface 300 may display values of one or more work unit parameters, and/or other information. By way of non-limiting illustration, a user interface element 301 may display a title of the unit of work 301. A user interface element 303 may display a due date of the unit of work 301. A user interface element 305 may display an assignee of the unit of work 301. A user interface element 304 may display a description of the unit of work 301.



FIG. 4 illustrates a user interface 400, in accordance with one or more implementations. The user interface 400 may be a user interface through which users upload digital assets to generate records. The user interface 400 may include an input portion 402 configured to receive user input of individual digital assets. The user interface 400 may include a user interface element 401 that provides access to a page of the collaboration environment from which the user interface 400 was accessed. By way of non-limiting illustration, based on requesting to upload a digital asset within a project page for “Project Q”, user interface 400 may be presented. The project record for Project Q may act as a source record for the subsequent creation of one or more records.


A user may input a digital asset into input portion 402, for example, a file representing recorded audio and/or video content (e.g., from a recorded meeting). Processing of the file may generate a transcript of the recording. By way of non-limiting illustration, the transcript may characterize user utterances that identify needs and/or desires of a speaker, identify a person to carry out work, identify work to be done, and/or other characterizations.



FIG. 5 illustrates a user interface 500 of a collaboration environment, in accordance with one or more implementations. The user interface 500 may include a view of a collaboration environment. In particular, the user interface 500 may comprise a work unit page 502 for a unit of work 501 obtained from output of a trained machine learning model that has been provided a transcript as input. The user interface 500 may display values of one or more work unit parameters, and/or other information. One or more of the values may reflect information learned from existing records and/or pages from the records. By way of non-limiting illustration, a user interface element 501 may display a title of the unit of work 501. A user interface element 503 may display a due date of the unit of work 501. A user interface element 505 may display an assignee of the unit of work 501. A user interface element 504 may display a description of the unit of work 501. The title, for example, may have a format or flow that follows what was learned from work unit page 302 in FIG. 3. The work unit page 502 may include a link 504 to the digital asset from which the unit of work was generated, which may be stored in the new work unit record and/or other record following upload.


Referring back to FIG. 1, in some implementations, server(s) 102, client computing platform(s) 104, and/or external resources 126 may be operatively linked via one or more electronic communication links. For example, such electronic communication links may be established, at least in part, via a network 116 such as the Internet and/or other networks. It will be appreciated that this is not intended to be limiting, and that the scope of this disclosure includes implementations in which server(s) 102, client computing platform(s) 104, and/or external resource(s) 126 may be operatively linked via some other communication media.


A given client computing platform may include one or more processors configured to execute computer program components. The computer program components may be configured to enable an expert or user associated with the given client computing platform to interface with system 100 and/or external resource(s) 126, and/or provide other functionality attributed herein to client computing platform(s) 104. By way of non-limiting example, the given client computing platform 104 may include one or more of a desktop computer, a laptop computer, a handheld computer, a tablet computing platform, a NetBook, a Smartphone, a gaming console, and/or other computing platforms.


External resource(s) 126 may include sources of information outside of system 100, external entities participating with system 100, and/or other resources. In some implementations, some or all of the functionality attributed herein to external resource(s) 126 may be provided by resources included in system 100.


Server(s) 102 may include electronic storage 128, one or more processors 130, and/or other components. Server(s) 102 may include communication lines, or ports to enable the exchange of information with a network 116 and/or other computing platforms. Illustration of server(s) 102 in FIG. 1 is not intended to be limiting. Server(s) 102 may include a plurality of hardware, software, and/or firmware components operating together to provide the functionality attributed herein to server(s) 102. For example, server(s) 102 may be implemented by a cloud of computing platforms operating together as server(s) 102.


Electronic storage 128 may comprise non-transitory storage media that electronically stores information. The electronic storage media of electronic storage 128 may include one or both of system storage that is provided integrally (i.e., substantially non-removable) with server(s) 102 and/or removable storage that is removably connectable to server(s) 102 via, for example, a port (e.g., a USB port, a firewire port, etc.) or a drive (e.g., a disk drive, etc.). Electronic storage 128 may include one or more of optically readable storage media (e.g., optical disks, etc.), magnetically readable storage media (e.g., magnetic tape, magnetic hard drive, floppy drive, etc.), electrical charge-based storage media (e.g., EEPROM, RAM, etc.), solid-state storage media (e.g., flash drive, etc.), and/or other electronically readable storage media. Electronic storage 128 may include one or more virtual storage resources (e.g., cloud storage, a virtual private network, and/or other virtual storage resources). Electronic storage 128 may store software algorithms, information determined by processor(s) 130, information received from server(s) 102, information received from client computing platform(s) 104, and/or other information that enables server(s) 102 to function as described herein.


Processor(s) 130 may be configured to provide information processing capabilities in server(s) 102. As such, processor(s) 130 may include one or more of a digital processor, a physical processor, an analog processor, a digital circuit designed to process information, an analog circuit designed to process information, a state machine, and/or other mechanisms for electronically processing information. Although processor(s) 130 is shown in FIG. 1 as a single entity, this is for illustrative purposes only. In some implementations, processor(s) 130 may include a plurality of processing units. These processing units may be physically located within the same device, or processor(s) 130 may represent processing functionality of a plurality of devices operating in coordination. Processor(s) 130 may be configured to execute components 108, 110, 112, 114, and/or other components. Processor(s) 130 may be configured to execute components 108, 110, 112, and/or 114, and/or other components by software; hardware; firmware; some combination of software, hardware, and/or firmware; and/or other mechanisms for configuring processing capabilities on processor(s) 130. As used herein, the term “component” may refer to any component or set of components that perform the functionality attributed to the component. This may include one or more physical processors during execution of processor readable instructions, the processor readable instructions, circuitry, hardware, storage media, or any other components.


It should be appreciated that although components 108, 110, 112 and/or 114 are illustrated in FIG. 1 as being implemented within a single processing unit, in implementations in which processor(s) 130 includes multiple processing units, one or more of components 108, 110, 112, and/or 114 may be implemented remotely from the other components. The description of the functionality provided by the different components 108, 110, 112, and/or 114 described below is for illustrative purposes, and is not intended to be limiting, as any of components 108, 110, 112, and/or 114 may provide more or less functionality than is described. For example, one or more of components 108, 110, 112, and/or 114 may be eliminated, and some or all of its functionality may be provided by other ones of components 108, 110, 112, and/or 114. As another example, processor(s) 130 may be configured to execute one or more additional components that may perform some or all of the functionality attributed below to one of components 108, 110, 112, and/or 114.



FIG. 2 illustrates a method 200 to generate records within a collaboration environment, in accordance with one or more implementations. The operations of method 200 presented below are intended to be illustrative. In some implementations, method 200 may be accomplished with one or more additional operations not described, and/or without one or more of the operations discussed. Additionally, the order in which the operations of method 200 are illustrated in FIG. 2 and described below is not intended to be limiting.


In some implementations, method 200 may be implemented in one or more processing devices (e.g., a digital processor, an analog processor, a digital circuit designed to process information, an analog circuit designed to process information, a state machine, and/or other mechanisms for electronically processing information). The one or more processing devices may include one or more devices executing some or all of the operations of method 200 in response to instructions stored electronically on an electronic storage medium. The one or more processing devices may include one or more devices configured through hardware, firmware, and/or software to be specifically designed for execution of one or more of the operations of method 200.


An operation 202 may manage environment state information maintaining a collaboration environment. The collaboration environment may be configured to facilitate interaction by users with the collaboration environment. The environment state information may include one or more records. The one or more records may include work unit records and/or other records. The work unit records may include work information and/or other information. The work information may characterize units of work created, managed, and/or assigned within the collaboration environment to the users who are expected to accomplish one or more actions to complete the units of work. The work information may comprise values of work unit parameters. Operation 202 may be performed by one or more hardware processors configured by machine-readable instructions including a component that is the same as or similar to environment state component 108, in accordance with one or more implementations.


An operation 204 may effectuate presentation of a user interface through which users upload digital assets representing video and/or audio content. The video and/or audio content may include human utterances and/or other content. The user interface may include one or more portions. The one or more portions may include an input portion. The input portion may be configured to receive user input of individual ones of the digital assets. Operation 204 may be performed by one or more hardware processors configured by machine-readable instructions including a component that is the same as or similar to user interface component 110, in accordance with one or more implementations.


An operation 206 may obtain input information defining the digital assets input via the user interface. The input information may define a first digital asset input into the user interface via the input portion. The first digital asset may include first recorded audio and/or video content including a first set of utterances. Operation 206 may be performed by one or more hardware processors configured by machine-readable instructions including a component that is the same as or similar to user interface component 110, in accordance with one or more implementations.


An operation 208 may, in response to obtaining the input information, generate transcription information and/or other information. The transcription information may characterize the audio and/or video content. The transcription information may include transcripts and/or other information. The transcripts may include text strings characterizing one or more of the utterances, physical gestures, facial expressions, physical movement, and/or other content. By way of non-limiting illustration, first transcription information may be generated from the first recorded audio and/or video content of the first digital asset. The first transcription information may include a first transcript comprising a first set of text strings characterizing the first set of utterances and/or other content. Operation 208 may be performed by one or more hardware processors configured by machine-readable instructions including a component that is the same as or similar to content component 112, in accordance with one or more implementations.


An operation 210 may generate individual records by providing transcription information as input into a trained machine-learning model. The trained machine-learning model may be configured to provide output based on input(s). The output may include new records. The output may include definitions of the new ones of the records based on the transcripts and/or other content. The output may define the new ones of the records by including one or more values of one or more parameters of the new ones of the records. By way of non-limiting illustration, the input into the trained machine-learning model may include the first transcription information, and the output may include a first work unit record. Operation 210 may be performed by one or more hardware processors configured by machine-readable instructions including a component that is the same as or similar to work creation component 114, in accordance with one or more implementations.
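Operations 208 and 210 together form a transcribe-then-generate pipeline that can be sketched as below. The `transcribe` and `model` callables are stand-ins for the speech-processing step and the trained machine-learning model; their shapes here are assumptions for illustration.

```python
def generate_records(digital_asset, transcribe, model):
    """Pipeline sketch: transcribe an uploaded digital asset (operation 208),
    then have the trained model emit new work unit records (operation 210)."""
    transcript = transcribe(digital_asset)  # transcription information
    return model(transcript)                # new records with parameter values

# Stand-in transcriber and model, purely for demonstration.
fake_transcribe = lambda asset: asset["audio_text"]
fake_model = lambda text: [{"title": text.split(".")[0]}]

records = generate_records({"audio_text": "Book the venue. Thanks all."},
                           fake_transcribe, fake_model)
```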


Although the present technology has been described in detail for the purpose of illustration based on what is currently considered to be the most practical and preferred implementations, it is to be understood that such detail is solely for that purpose and that the technology is not limited to the disclosed implementations, but, on the contrary, is intended to cover modifications and equivalent arrangements that are within the spirit and scope of the appended claims. For example, it is to be understood that the present technology contemplates that, to the extent possible, one or more features of any implementation can be combined with one or more features of any other implementation.

Claims
  • 1. A system configured to generate work unit records within a collaboration environment, the system comprising: one or more physical processors configured by machine-readable instructions to: manage, by a server, electronically stored environment state information maintaining a collaboration environment, the collaboration environment being configured to facilitate interaction by users with the collaboration environment, the users interacting with the collaboration environment through remotely located client computing platforms communicating with the server over one or more network connections, the environment state information including work unit records, the work unit records including work information characterizing units of work created within the collaboration environment, managed by the users within the collaboration environment, and assigned to within the collaboration environment to the users who are expected to accomplish one or more actions to complete the units of work, the work information comprising values of work unit parameters characterizing the units of work;establish the one or more network connections between the server and the remotely located client computing platforms;obtain, by the server, input information defining digital assets for recorded audio content, the recorded audio content including utterances, wherein the input information defines a first digital asset for first recorded audio content including a first set of utterances;generate, by the server, transcription information characterizing the recorded audio content of the digital assets, the transcription information including transcripts of the utterances, the transcripts including text strings characterizing the utterances, wherein first transcription information is generated from the first recorded audio content of the first digital asset, the first transcription information including a first transcript comprising a first set of text strings characterizing the first set of 
utterances;train, by the server, a machine-learning model based on a text corpus to generate a trained machine-learning model, the text corpus comprising user-generated text that makes up pages of a graphical user interface of the collaboration environment through which the users access the work unit records, the trained machine-learning model being configured to provide output including new work information defining new ones of the work unit records;provide, by the server, the transcription information as input into the trained machine-learning model, such that the input into the trained machine-learning model includes the first transcription information;obtain, by the server, the output from the trained machine-learning model, the output including the new work information defining the new ones of the work unit records based on the transcripts, wherein the new work information defining the new ones of the work unit records includes the values of one or more of the work unit parameters of the new ones of the work unit records, such that the output includes first new work information of a first work unit record based on the input of the first transcription information;generate, by the server, user interface information defining new pages of the graphical user interface of the collaboration environment through which the users access the new work information defining the new ones of the work unit records;effectuate communication of the user interface information from the server to the remotely located client computing platforms over the one or more network connections to cause the remotely located client computing platforms to present the new pages, such that first user interface information defining a first new page associated with the first work unit record is communicated to a first remotely located client computing platform to cause the first remotely located client computing platform to present the first new page through which a first user accesses the first new 
work information;obtain, by the server, further input information conveying user input into the new pages that corrects and/or validates the new work information appearing on the new pages; andrefine, by the server, the trained machine-learning model based on whether the new work information has been corrected and/or validated through the user input into the new pages, such that the trained machine-learning model is refined in response to the first new work information appearing on the first new page being corrected or validated.
  • 2. The system of claim 1, wherein the digital assets include video files comprising the recorded audio content and visual content.
  • 3. The system of claim 1, wherein generating the transcription information includes performing speech processing on the recorded audio content and semantic natural language processing to determine meaning of the utterances, wherein the meaning of the utterances are included in the transcripts.
  • 4. The system of claim 1, wherein the one or more physical processors are further configured by the machine-readable instructions to: compile, by the server, the text corpus into input/output training pairs, wherein training the machine-learning model based on the text corpus comprises training the machine-learning model on the input/output training pairs to generate the trained machine-learning model, the input/output training pairs including training input information and training output information, the training input information for an individual input/output training pair including an individual set of the user-generated text making up an individual work unit page for an individual existing one of the work unit records, and the training output information for the individual input/output training pair including the values of the work unit parameters of the individual existing one of the work unit records; andstore the trained machine-learning model.
  • 5. The system of claim 1, wherein the one or more physical processors are further configured by the machine-readable instructions to: further provide context information as the input into the trained machine-learning model, the context information specifying context of the digital assets, the context including one or more users linked to individual ones of the digital assets.
  • 6. The system of claim 5, wherein a user linked to a digital asset includes one or more of a creator of the digital asset or an uploader of the digital asset.
  • 7. The system of claim 1, wherein the output includes the first work unit record by virtue of including a first set of values for the work unit parameters for the first work unit record.
  • 8. The system of claim 7, wherein the first set of values includes one or more of a first value for a title parameter, a second value of a work assignment parameter, and/or a third value for a due date parameter.
  • 9. The system of claim 7, wherein individual ones of the values in the first set of values include individual text strings in the first set of text strings.
  • 10. The system of claim 7, wherein individual ones of the values in the first set of values include individual meanings of the utterances.
  • 11. A method to generate work unit records within a collaboration environment, the method comprising:
    managing, by a server, electronically stored environment state information maintaining a collaboration environment, the collaboration environment being configured to facilitate interaction by users with the collaboration environment, the users interacting with the collaboration environment through remotely located client computing platforms communicating with the server over one or more network connections, the environment state information including work unit records, the work unit records including work information characterizing units of work created within the collaboration environment, managed by the users within the collaboration environment, and assigned within the collaboration environment to the users who are expected to accomplish one or more actions to complete the units of work, the work information comprising values of work unit parameters characterizing the units of work;
    establishing the one or more network connections between the server and the remotely located client computing platforms;
    obtaining, by the server, input information defining digital assets for recorded audio content, the recorded audio content including utterances, wherein the input information defines a first digital asset for first recorded audio content including a first set of utterances;
    generating, by the server, transcription information characterizing the recorded audio content of the digital assets, the transcription information including transcripts of the utterances, the transcripts including text strings characterizing the utterances, wherein first transcription information is generated from the first recorded audio content of the first digital asset, the first transcription information including a first transcript comprising a first set of text strings characterizing the first set of utterances;
    training, by the server, a machine-learning model based on a text corpus to generate a trained machine-learning model, the text corpus comprising user-generated text that makes up pages of a graphical user interface of the collaboration environment through which the users access the work unit records, the trained machine-learning model being configured to provide output including new work information defining new ones of the work unit records;
    providing, by the server, the transcription information as input into the trained machine-learning model, the input into the trained machine-learning model including the first transcription information;
    obtaining, by the server, the output from the trained machine-learning model, the output including the new work information defining the new ones of the work unit records based on the transcripts, wherein the new work information defining the new ones of the work unit records includes the values of one or more of the work unit parameters of the new ones of the work unit records, the output including first new work information of a first work unit record based on the input of the first transcription information;
    generating, by the server, user interface information defining new pages of the graphical user interface of the collaboration environment through which the users access the new work information defining the new ones of the work unit records;
    effectuating communication of the user interface information from the server to the remotely located client computing platforms over the one or more network connections to cause the remotely located client computing platforms to present the new pages, including effectuating communication of first user interface information defining a first new page associated with the first work unit record to a first remotely located client computing platform to cause the first remotely located client computing platform to present the first new page through which a first user accesses the first new work information;
    obtaining, by the server, further input information conveying user input into the new pages that corrects and/or validates the new work information appearing on the new pages; and
    refining, by the server, the trained machine-learning model based on whether the new work information has been corrected and/or validated through the user input into the new pages, including refining the trained machine-learning model in response to the first new work information appearing on the first new page being corrected or validated.
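The transcribe-then-infer flow recited in claim 11 can be illustrated with a minimal sketch. All names here (`WorkUnitRecord`, `transcribe`, `trained_model`) are hypothetical stand-ins, not the patented implementation: the "speech processing" is faked by passing text through, and the "trained machine-learning model" is replaced by trivial keyword rules that map a transcript to values of work unit parameters.

```python
from dataclasses import dataclass

@dataclass
class WorkUnitRecord:
    """A new work unit record defined by model output (hypothetical schema)."""
    title: str = ""
    assignee: str = ""
    due_date: str = ""

def transcribe(utterances):
    # Stand-in for speech processing: join the recorded utterances
    # into a single transcript of text strings.
    return ", ".join(utterances)

def trained_model(transcript):
    # Stand-in for the trained machine-learning model: parse
    # "key: value" fragments into work unit parameter values.
    values = {}
    for fragment in transcript.split(","):
        key, _, val = fragment.partition(":")
        values[key.strip()] = val.strip()
    return values

def generate_record(digital_asset_utterances):
    # Transcription information is provided as input into the model;
    # the output defines a new work unit record.
    transcript = transcribe(digital_asset_utterances)
    out = trained_model(transcript)
    return WorkUnitRecord(
        title=out.get("title", ""),
        assignee=out.get("assignee", ""),
        due_date=out.get("due", ""),
    )

record = generate_record(
    ["title: Draft launch plan", "assignee: Ana", "due: Friday"]
)
```

In the claimed system the keyword rules would be a model trained on the collaboration environment's own page text; the sketch only shows the data flow from digital asset to new record.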
  • 12. The method of claim 11, wherein the digital assets include video files comprising the recorded audio content and visual content.
  • 13. The method of claim 11, wherein the generating the transcription information includes performing speech processing on the recorded audio content and semantic natural language processing to determine meaning of the utterances, wherein the meaning of the utterances is included in the transcripts.
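Claim 13 adds a semantic pass so that the meaning of each utterance is included in the transcript. A minimal sketch, with naive keyword rules standing in for semantic natural language processing (the function name and meaning labels are illustrative assumptions):

```python
def add_meanings(utterance_texts):
    # Speech processing has already yielded text; a simple semantic
    # pass (keyword rules as a stand-in for NLP) attaches a 'meaning'
    # to each utterance, which is included in the transcript.
    transcript = []
    for text in utterance_texts:
        low = text.lower()
        if "assign" in low or "owner" in low:
            meaning = "assignment"
        elif "due" in low or "deadline" in low:
            meaning = "due_date"
        else:
            meaning = "statement"
        transcript.append({"text": text, "meaning": meaning})
    return transcript

tagged = add_meanings(["Assign this to Pat", "Deadline is Friday"])
```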
  • 14. The method of claim 11, further comprising: compiling, by the server, the text corpus into input/output training pairs, wherein the training of the machine-learning model based on the text corpus comprises training the machine-learning model on the input/output training pairs to generate the trained machine-learning model, the input/output training pairs including training input information and training output information, the training input information for an individual input/output training pair including an individual set of the user-generated text making up an individual work unit page for an individual existing one of the work unit records, and the training output information for the individual input/output training pair including the values of the work unit parameters of the individual existing one of the work unit records; andstoring the trained machine-learning model in non-transitory electronic storage.
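The pairing recited in claim 14 — page text as training input, that record's parameter values as training output — can be sketched as follows. The record shape (`page_text`, `parameter_values` keys) is a hypothetical assumption for illustration:

```python
def compile_training_pairs(existing_records):
    # For each existing work unit record: the training input is the
    # user-generated text making up its page; the training output is
    # the values of its work unit parameters.
    pairs = []
    for rec in existing_records:
        pairs.append({
            "input": rec["page_text"],
            "output": rec["parameter_values"],
        })
    return pairs

records = [{
    "page_text": "Ship v2 docs, owned by Lee, due Monday",
    "parameter_values": {"title": "Ship v2 docs",
                         "assignee": "Lee",
                         "due": "Monday"},
}]
pairs = compile_training_pairs(records)
```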
  • 15. The method of claim 11, further comprising: further providing context information as the input into the trained machine-learning model, the context information specifying context of the digital assets, the context including one or more users linked to individual ones of the digital assets.
  • 16. The method of claim 15, wherein a user linked to a digital asset includes one or more of a creator of the digital asset or an uploader of the digital asset.
  • 17. The method of claim 11, wherein the output includes the first work unit record by virtue of including a first set of values for the work unit parameters for the first work unit record.
  • 18. The method of claim 17, wherein the first set of values include one or more of a first value for a title parameter, a second value of a work assignment parameter, and/or a third value for a due date parameter.
  • 19. The method of claim 17, wherein individual ones of the values in the first set of values include individual text strings in the first set of text strings.
  • 20. The method of claim 17, wherein individual ones of the values in the first set of values include individual meanings of the utterances.
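The refinement step that closes claim 11 — using whether users corrected or validated the new work information — can be sketched as collecting feedback examples for further training. Record and edit shapes here are hypothetical assumptions:

```python
def collect_refinement_examples(new_records, user_edits):
    # A validated record confirms the model output as-is; a corrected
    # record pairs the transcript with the user-fixed values, so the
    # model can be refined on both signals.
    examples = []
    for rec in new_records:
        fixed = user_edits.get(rec["id"])
        examples.append({
            "input": rec["transcript"],
            "output": fixed if fixed is not None else rec["values"],
            "corrected": fixed is not None,
        })
    return examples

new_records = [
    {"id": "t1", "transcript": "ship the beta by friday",
     "values": {"title": "Ship the beta", "due": "Friday"}},
    {"id": "t2", "transcript": "pat reviews the docs",
     "values": {"title": "Review docs", "assignee": "Sam"}},
]
user_edits = {"t2": {"title": "Review docs", "assignee": "Pat"}}
examples = collect_refinement_examples(new_records, user_edits)
```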