Recent years have seen rapid development in hardware and software platforms for creating, managing, distributing, and collecting digital surveys across computer networks. Indeed, conventional digital survey systems can automatically generate digital survey questions for digital content providers; distribute the digital surveys to client devices (e.g., as digital notifications, webpage embeddings, or application elements); identify survey responses; and then analyze the survey responses utilizing complex computer models to generate user interfaces that include valuable insights for entity survey administrators. As these conventional digital survey systems implement digital surveys, they also generate large digital survey repositories storing large volumes of various types and formats of digital information. For example, conventional digital survey systems can build voluminous repositories of digital survey data with a variety of different digital labels across a variety of different clients.
Although conventional digital survey systems can collect and analyze large data volumes, these systems face a number of technical problems with regard to accuracy, efficiency, and flexibility of operation. For example, conventional systems often struggle to accurately interpret and analyze large volumes of dynamic survey data generated from a variety of different sources. To illustrate, digital survey responses can include various different contexts, formats, and labels, which can interfere with the accuracy of computer-implemented analysis models utilized to process and transform the digital survey responses into valuable digital insights. Accordingly, conventional digital survey systems often ignore, exclude, mischaracterize, and/or misuse survey data. Further, conventional digital survey systems can generate inaccurate and incomplete survey analyses, causing client devices to migrate to alternate digital systems.
Additionally, conventional digital survey systems can also suffer from inefficiencies in processing and analyzing large volumes of survey response data. For example, in order to process digital survey data gathered from multiple sources, some conventional digital survey systems require individual administrator devices to review and tag individual digital survey responses. Such response-by-response tagging utilizes excessive time, user interactions, user interfaces, and computing resources to process large repositories of digital survey information.
In addition to accuracy and efficiency concerns, conventional digital survey systems are also rigid and inflexible. Digital survey data is dynamic and can quickly change over time. Indeed, digital survey questions can be modified in real time, large numbers of survey responses can pour in over time, and digital survey models can change based on these modifications. Moreover, digital survey repositories can reflect dynamic digital information from a variety of different surveys, corresponding to a number of different clients, and collected from a variety of different distribution channels. Nonetheless, conventional digital survey systems are inflexible and rigid in processing and utilizing large digital survey data volumes. As noted above, many conventional digital survey systems require a rigid process of applying labels to individual digital survey responses, which cannot accommodate the dynamic, real-time, and variable demands of online digital survey information. Accordingly, conventional digital survey systems are often unable to flexibly provide useful digital survey analysis utilizing different survey data types and/or entities.
These, along with additional problems and issues, exist with regard to conventional digital survey systems.
Embodiments of the present disclosure provide benefits and/or solve one or more of the foregoing or other problems in the art with systems, non-transitory computer-readable media, and methods for efficiently, accurately, and flexibly applying attribute definitions to digital survey data (including digital survey questions and corresponding digital survey responses) and utilizing these attribute definitions to generate digital survey analyses from large digital survey data volumes. To illustrate, the disclosed systems can intelligently suggest attribute definitions for new and/or existing digital surveys and digital survey responses to efficiently align digital surveys to a global labeling schema. Additionally, the disclosed systems can intelligently apply attribute definitions to digital survey questions and corresponding digital survey responses as they are generated and collected. As digital survey questions dynamically change (and as survey responses are received over time), the disclosed systems can flexibly apply attribute definitions to modified digital survey questions and corresponding responses. Further, the disclosed systems can provide efficient user interfaces for generating and utilizing template digital survey questions and/or assigning attribute definitions to digital surveys. In this manner, the disclosed systems can accurately apply attribute definitions that align digital survey data to a global labeling schema and improve the efficiency and flexibility of implementing computing devices in generating survey analyses.
Additional features and advantages of one or more embodiments of the present disclosure are outlined in the description which follows, and in part will be obvious from the description, or may be learned by the practice of such example embodiments.
The detailed description provides one or more embodiments with additional specificity and detail through the use of the accompanying drawings, as briefly described below.
This disclosure describes one or more embodiments of a survey attribute definition system that determines and applies attribute definitions to various types of digital survey data and utilizes these attribute definitions to generate digital survey analyses from large volumes of digital survey data. Indeed, the survey attribute definition system can accurately apply attribute definitions to align large volumes of digital survey data to an overarching global label schema, thus improving the efficiency and flexibility of implementing computing devices and resulting survey analyses. For example, the survey attribute definition system can intelligently suggest or apply attribute definitions for digital surveys and/or digital survey responses based on prior usage of attribute definitions, the text of digital surveys, characteristics of digital survey templates, or other criteria. In addition, as digital surveys change over time, the survey attribute definition system can intelligently apply attribute definitions to modified digital survey resources. As the survey attribute definition system identifies new digital survey responses, it can apply attribute definitions to align new digital survey data to the global label schema. Further, the survey attribute definition system can provide efficient user interfaces for generating template digital survey questions, assigning attribute definitions to digital surveys, and providing digital survey analyses to further improve efficiency and flexibility of implementing devices.
As mentioned, the survey attribute definition system can assign attribute definitions to digital survey data and then utilize the attribute definitions to interpret and analyze the survey data (e.g., to generate digital survey analyses such as survey dashboards). In one or more embodiments, the survey attribute definition system can generate and utilize a variety of kinds of attribute definitions. For example, the survey attribute definition system can provide a graphical user interface that allows a survey administrator to create attribute definitions. In one or more embodiments, the survey attribute definition system can utilize administrator input specifying a corresponding format and type (e.g. an enumeration type) for survey data associated with the attribute definition.
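An attribute definition with an associated format and enumeration type can be pictured with the following Python sketch. The class name, field layout, and the "TShirtSize" example values are illustrative assumptions rather than the disclosed implementation:

```python
from dataclasses import dataclass, field
from typing import List
import uuid

@dataclass
class AttributeDefinition:
    # Human-readable name plus a universally unique identifier for the label.
    name: str
    definition_id: str = field(default_factory=lambda: str(uuid.uuid4()))
    # Optional enumeration type: the allowed categories for tagged survey data.
    enumerated_categories: List[str] = field(default_factory=list)

    def accepts(self, value: str) -> bool:
        """Return True if a response value fits this definition's enumeration
        (any value is accepted when no enumeration is specified)."""
        return not self.enumerated_categories or value in self.enumerated_categories

# Example: an administrator-created definition with an enumeration type.
shirt_size = AttributeDefinition(
    name="TShirtSize", enumerated_categories=["S", "M", "L", "XL"]
)
```

A definition created without an enumeration (e.g., for free-text comments) would accept any value.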
In one or more embodiments, the survey attribute definition system can organize attribute definitions into attribute tags. Specifically, the survey attribute definition system can generate attribute tags that reflect categories or classifications of attribute definitions. By generating an organization of attribute tags that reflect classifications of attribute definitions, the survey attribute definition system can more efficiently identify and apply attribute definitions to digital survey data.
To illustrate, in one or more embodiments, the survey attribute definition system provides (e.g. via an entity administrator device) a graphical user interface including selectable elements corresponding to attribute definitions. In particular, the survey attribute definition system can provide the selectable elements corresponding to the attribute definitions in an organization based on attribute tags. In one or more embodiments, the survey attribute definition system provides the selectable elements corresponding to the attribute definitions in a digital survey creation graphical user interface. Accordingly, the survey attribute definition system can apply attribute definitions to a digital survey (e.g., a digital survey question) based on user interaction with these selectable elements.
Additionally, the survey attribute definition system can apply attribute definitions to a digital survey question and/or digital survey response based on analysis of the digital survey question and/or digital survey response. For example, in some embodiments, the survey attribute definition system analyzes the text of a digital survey question utilizing a machine learning model or heuristic model to determine an appropriate attribute definition. In addition to the text of a digital survey question, the survey attribute definition system can analyze other features, such as a data input type associated with the digital survey question or metadata associated with the digital survey question to determine an appropriate attribute definition. The survey attribute definition system can suggest these determined attribute definitions in a survey creation graphical user interface or can automatically apply a determined attribute definition.
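As a minimal stand-in for such a heuristic model (the disclosure does not specify a particular model), a keyword lookup over question text could suggest candidate attribute definitions. The rule table below is a hypothetical example:

```python
from typing import Optional

# Keyword heuristic mapping question text to candidate attribute definitions.
# The keyword lists and definition names are illustrative assumptions.
KEYWORD_RULES = {
    "BirthYear": ["birth year", "year were you born"],
    "Age": ["how old", "your age"],
    "NPS": ["recommend", "likely are you to recommend"],
}

def suggest_attribute_definition(question_text: str) -> Optional[str]:
    """Return the first attribute definition whose keywords appear in the
    question text, or None if no rule matches."""
    text = question_text.lower()
    for definition, keywords in KEYWORD_RULES.items():
        if any(kw in text for kw in keywords):
            return definition
    return None
```

A suggestion of None would prompt the administrator to select a definition manually.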
The survey attribute definition system can also apply attribute definitions to digital survey data based on utilization of a survey template. For example, in creating a digital survey, the survey attribute definition system can identify selection of a digital survey template, including a template for a digital survey question. In one or more embodiments, the survey attribute definition system identifies an attribute definition associated with the template digital survey question. Based on usage of the template, the survey attribute definition system can automatically apply the associated attribute definition to the digital survey and/or digital survey question, and to corresponding digital survey responses.
Further, in some embodiments, the survey attribute definition system determines and applies attribute definitions for modified digital surveys and/or modified digital survey questions. The survey attribute definition system can identify that a digital survey utilizes modified survey questions from prior digital surveys. Similarly, the survey attribute definition system can determine that a digital survey question in a new digital survey is similar to a previously used digital survey question. In some embodiments, the survey attribute definition system utilizes the pre-modification or previously used digital survey questions to identify an attribute definition for the modified or new digital survey question.
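One hypothetical way to score the similarity between a new or modified question and previously used questions (the disclosure does not name a particular measure) is a character-level ratio over question text, assuming a mapping from prior question text to its attribute definition:

```python
from difflib import SequenceMatcher

def matching_prior_definition(new_question, prior_questions, threshold=0.8):
    """Return the attribute definition of the most similar prior question,
    or None when no prior question is similar enough.

    prior_questions maps prior question text -> attribute definition."""
    best_definition, best_score = None, threshold
    for prior_text, definition in prior_questions.items():
        score = SequenceMatcher(None, new_question.lower(), prior_text.lower()).ratio()
        if score >= best_score:
            best_definition, best_score = definition, score
    return best_definition
```

The threshold value is an illustrative assumption; a production system would tune it against labeled examples.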
The survey attribute definition system can also apply attribute definitions to existing survey data, including digital surveys, digital survey questions, and/or survey responses. For example, the survey attribute definition system can apply attribute definitions to an existing digital survey database. The survey attribute definition system can utilize many of the techniques described above to determine and apply attribute definitions to existing survey data. For example, the survey attribute definition system can analyze the text of survey questions and/or corresponding survey responses to determine an attribute definition to apply to either the survey questions or survey responses. Additionally, the survey attribute definition system can utilize metadata associated with the survey data, including existing tags or definitions, to determine an attribute definition to apply. Moreover, the survey attribute definition system can provide efficient user interfaces for assigning attribute definitions to historical digital survey questions (and then apply the attribute definitions to historical survey responses).
Further, as mentioned above, the survey attribute definition system can generate digital survey analyses utilizing attribute definitions assigned to digital survey data. More specifically, the survey attribute definition system can utilize attribute definitions to identify and categorize survey data and then utilize computer-implemented models to generate insights and survey analyses reflecting the digital survey data. For example, the survey attribute definition system can utilize attribute definitions to identify and classify digital survey data across clients, surveys, and distribution channels to generate survey reports, survey dashboards, predictions, and a variety of other types of digital survey analyses.
The survey attribute definition system can provide a variety of advantages and benefits over conventional systems and methods. For example, the survey attribute definition system can improve accuracy of implementing computing devices, digital survey models, and survey analyses by identifying and applying pertinent attribute definitions to digital survey data. As mentioned above, the survey attribute definition system can generate user interfaces for accurately aligning attribute definitions to new or existing digital survey questions and then automatically map these attribute definitions to resulting digital survey responses. Further, the survey attribute definition system can automatically suggest accurate attribute definitions for survey data based on various models. Moreover, the survey attribute definition system can improve accuracy of resulting models and survey analyses by utilizing these attribute definitions to identify and interpret survey data. To illustrate, applying and utilizing attribute definitions allows the survey attribute definition system to determine survey data corresponding to the same attribute even when format, source, entity, or other features of the survey data differ. This recognition enables accurate analysis across a variety of types of survey data from different sources, different survey campaigns, or different distribution channels.
Further, the survey attribute definition system improves efficiency relative to conventional systems. In particular, the survey attribute definition system can apply attribute definitions to more efficiently interpret and utilize digital survey data in generating survey analyses. For example, the survey attribute definition system can utilize efficient user interfaces for suggesting and applying attribute definitions to digital survey data. The survey attribute definition system can also automatically apply attribute definitions to digital survey questions and/or digital survey responses. Furthermore, by applying attribute definitions as part of a global label schema, the survey attribute definition system can significantly reduce resources needed to identify and utilize survey data from large survey data repositories.
The survey attribute definition system can also improve flexibility relative to conventional systems. To illustrate, by identifying and applying attribute definitions to digital survey data, the survey attribute definition system can analyze survey data across different formats, different digital surveys, and different entities. In addition, the survey attribute definition system can more flexibly accommodate dynamic changes in digital survey data, including changes to digital survey questions, ongoing collection of digital survey responses, and updating models reflecting this modified data. Indeed, by automatically mapping attribute definitions to digital survey data, the survey attribute definition system can dynamically address changes in digital survey data to flexibly provide up-to-date and accurate survey analyses across clients, surveys, and distribution channels.
As illustrated by the foregoing discussion, the present disclosure utilizes a variety of terms to describe features and advantages of the survey attribute definition system. Additional detail is now provided regarding the meaning of such terms. For example, as used herein, the term “attribute definition” refers to a label, identifier, or classification. In particular, the term “attribute definition” can include a unique universal identifier for a class or type of survey data (e.g., survey data from a variety of sources). The survey attribute definition system can apply attribute definitions to digital surveys, digital survey responses, digital survey questions, and a variety of other survey data. For example, the survey attribute definition system can utilize an attribute definition “BirthYear” for digital survey questions or digital survey responses corresponding to a birth year.
As also used herein, the term “schema module” refers to a group or collection of attribute definitions. In particular, the term “schema module” can include a listing of attribute definitions that the survey attribute definition system utilizes to manage a dataset. The survey attribute definition system can utilize a schema module to apply and utilize attribute definitions for digital survey data from different sources, including from different entities.
Additionally, as used herein, the term “digital survey” refers to a digital communication that collects information concerning one or more respondents by capturing information from (or posing questions to) such respondents. For example, a digital survey can include a set of digital survey questions or content intended for distribution over a network by way of respondent devices and further intended to collect responses to the digital survey questions for generating survey results from the collected responses. A digital survey can include one or more digital survey questions and corresponding answer choices that accompany the given question. Accordingly, a digital survey may include digital survey content, such as the elements that form a digital survey, including, but not limited to, digital survey questions, survey question formats, transmission formats, or information about a respondent.
Further, as used herein, the term “digital survey question” refers to a prompt included in a digital survey that invokes a response from a respondent, or that requests information from a respondent. In one or more embodiments, when one or more answer choices are available for a digital survey question, a digital survey question may include a question portion as well as an available answer choice portion that corresponds to the survey question. For example, a digital survey question can comprise prompts such as “how was your dining experience” or “please select your favorite products.” Relatedly, as used herein, the term “template” (e.g., a survey template question or survey template) refers to a pre-defined pattern or sample. Thus, a “survey template question” refers to a pre-defined digital survey question (with a pre-defined attribute definition) that can be selected for inclusion in a digital survey. Similarly, as used herein, the term “modified digital survey question” refers to a digital survey question generated as a modification of another digital survey question.
Also, as used herein, the term “digital survey response” refers to digital information provided by a respondent device corresponding to a digital survey question. A digital survey response may include, but is not limited to, a selection, text input, audio input (or other user input) indicating a response (e.g., an answer) to a digital survey question. Further, in some embodiments, a digital survey response includes metadata associated with a digital survey response, including data on a corresponding digital survey question, data regarding a survey respondent, and other data about the digital survey response.
Further, as used herein, the term “digital survey analysis” refers to a digital summary, evaluation, prediction, or report based on survey data. In particular, the term “digital survey analysis” can include a variety of digital documents detailing various aspects of survey responses, including survey dashboards, survey response search results, survey reports, predictions of target features based on survey response, and a variety of other analyses. As outlined in greater detail below, the survey attribute definition system can generate a digital survey analysis utilizing attribute definitions associated with various digital surveys, digital survey responses, digital survey questions, and other digital survey data.
As used herein, the term “survey data” refers to digital information related to a digital survey. In particular, the term “survey data” can include a digital survey, a digital survey response, a digital survey question, digital survey analysis, or a variety of other data related to a digital survey. The survey attribute definition system can apply attribute definitions to a variety of kinds of survey data.
Additionally, as used herein, the term “survey response database” refers to a digital collection or repository of digital survey responses. In particular, the term “survey response database” can include a database of various digital survey data, including survey responses from a variety of sources and from a variety of entities. In one or more embodiments, the different entities are different organizations that administer surveys and/or collect survey responses.
Also, as used herein, the term “net promoter score classification” refers to a measure of loyalty between a provider and a consumer. In particular, the term “net promoter score classification” can include a score, term, or other category representing a strength of a relationship between an entity that administers surveys and a corresponding consumer base. For example, in response to a question of “how likely are you to recommend a product to a colleague,” a net promoter score classification can include a “promoter classification” (scoring from 9 to 10 on a scale from 0 to 10), a “passive classification” (scoring from 7 to 8), or a “detractor classification” (scoring from 0 to 6).
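The classification ranges described above can be expressed directly in code. This sketch also computes the conventional aggregate net promoter score (promoter percentage minus detractor percentage), which is an industry convention rather than part of the disclosure:

```python
def classify_nps(score: int) -> str:
    """Map a 0-10 likelihood-to-recommend score to a net promoter score
    classification, per the ranges described above."""
    if not 0 <= score <= 10:
        raise ValueError("NPS scores run from 0 to 10")
    if score >= 9:
        return "promoter"
    if score >= 7:
        return "passive"
    return "detractor"

def net_promoter_score(scores) -> int:
    """Aggregate NPS: percentage of promoters minus percentage of detractors."""
    labels = [classify_nps(s) for s in scores]
    promoters = labels.count("promoter") / len(labels)
    detractors = labels.count("detractor") / len(labels)
    return round(100 * (promoters - detractors))
```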
Additional detail will now be provided regarding the survey attribute definition system in relation to illustrative figures. For example,
As shown in
As shown, the server device(s) 102 implement a digital survey system 104 and the survey attribute definition system 106. In general, the digital survey system 104 creates, administers, and analyzes digital surveys. For example, the digital survey system 104 can interact with an administrator device to create, modify, and run a digital survey that includes various prompts (e.g., digital survey questions). In addition, the digital survey system 104 provides survey prompts to, and collects responses from, respondent devices (e.g. via the respondent device 118). Further, as shown in
As further shown in
To illustrate, in some embodiments, the survey attribute definition system 106 can generate attribute definitions based on user interaction with graphical user interfaces provided to an administrator device (e.g. the entity administrator device 110 and/or the survey administrator device 114). Further, the survey attribute definition system 106 can generate a digital survey based on interactions received via the entity administrator device 110 and/or the survey administrator device 114. For example, the survey attribute definition system 106 can apply selected attribute definitions to digital surveys and/or digital survey questions automatically and/or based on interactions received via the entity administrator device 110 and/or the survey administrator device 114. The survey attribute definition system 106 can also collect digital survey responses (e.g. received from the respondent device 118). The survey attribute definition system 106 can apply attribute definitions to received digital survey responses based on attribute definitions applied to corresponding digital surveys and/or digital survey questions. Further, in some embodiments, the survey attribute definition system 106 analyzes historical digital survey data from a digital survey database and assigns attribute definitions based on the analysis.
Although
The survey attribute definition system 106 can be implemented on a variety of computing devices. In particular, and as described above, the survey attribute definition system 106 may be implemented in whole or in part by the server device(s) 102 or the survey attribute definition system 106 may be implemented in whole or in part by the entity administrator device 110, the survey administrator device 114, and the respondent device 118. Accordingly, the survey attribute definition system 106 may be implemented across multiple devices or components.
As discussed above, the survey attribute definition system 106 can apply attribute definitions to a variety of digital survey data, including digital survey responses.
As shown in
The attribute definitions 204 can include a universally unique identifier that the survey attribute definition system 106 can apply to survey data. In one or more embodiments, the attribute definitions 204 are further associated with a category type (e.g., including enumerated categories) for tagged survey data. To illustrate, a category type for the attribute definition “TShirtSize” can include enumerated categories for sizing, such as “S,” “M,” “L,” etc. The survey attribute definition system 106 can utilize the category type and its enumerated categories to interpret and utilize survey data tagged by an attribute definition. Accordingly, the survey attribute definition system 106 can utilize the attribute definitions 204 to identify and interpret survey data. As will be discussed in greater detail below, the survey attribute definition system 106 can utilize a variety of criteria to determine an attribute definition of the attribute definitions 204 to apply to the digital survey questions and responses 202.
As shown in
The survey attribute definition system 106 can utilize the applied attribute definitions to identify the defined digital survey questions and responses 206 from among a large volume of survey data. Additionally, in some embodiments, the survey attribute definition system 106 utilizes the applied attribute definitions to interpret the defined digital survey questions and responses 206. Thus, the survey attribute definition system 106 can utilize the defined digital survey questions and responses 206 to generate digital survey analyses that communicate the context of survey data, even across survey data types and survey data sources. For example, the survey attribute definition system 106 can utilize both the digital survey response “35” tagged with “Age” and a digital survey response containing an age range, such as “30-39,” tagged with “Age” from a different digital survey administered by a different entity. The survey attribute definition system 106 can identify the two digital survey responses as both corresponding to “Age.” Further, the survey attribute definition system 106 can interpret and utilize both digital survey responses together in a survey report despite their differing input types and differing sources.
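To illustrate how differing input types tagged with the same attribute definition can feed one report, the following hypothetical sketch normalizes both exact ages and age ranges to decade buckets before counting:

```python
from collections import Counter

def age_bucket(value: str) -> str:
    """Normalize an "Age"-tagged response to a decade bucket, whether the
    source recorded an exact age ("35") or a range ("30-39")."""
    if "-" in value:
        low = int(value.split("-")[0])
    else:
        low = int(value)
    decade = (low // 10) * 10
    return f"{decade}-{decade + 9}"

# Responses from different surveys and entities, all tagged "Age":
distribution = Counter(age_bucket(v) for v in ["35", "30-39", "42"])
```

The bucketing scheme is an illustrative choice; any shared normalization keyed off the attribute definition would serve the same purpose.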
As mentioned, the survey attribute definition system 106 can apply attribute definitions to a digital survey response based on an association between an attribute definition and the digital survey corresponding to the digital survey response. Additionally, in some embodiments, the survey attribute definition system 106 applies attribute definitions to individual digital survey questions of a digital survey.
As shown in
Additionally, the survey attribute definition system 106 can perform an act 304 of receiving selection of an attribute definition. To illustrate, a user device (e.g. an entity administrator device) can detect user selection of a selectable element corresponding to an attribute definition.
Further, as shown in
In some embodiments, the survey attribute definition system 106 applies the selected attribute definition to a digital survey or to a digital survey question by modifying digital metadata associated with the digital survey or digital survey question to include the attribute definition. Additionally, in some embodiments, the survey attribute definition system 106 modifies metadata to include data associated with the attribute definition, such as an enumeration type specifying a category type that includes enumerated categories for survey data associated with the attribute definition. Further, the attribute definition can include a globally unique identifier that the survey attribute definition system 106 can utilize to identify survey data corresponding to an attribute definition. As will be discussed in greater detail below with regard to
Additionally or in the alternative, the survey attribute definition system 106 can apply the attribute definition to digital survey data by storing the attribute definition separately from the digital survey data. In such an embodiment, the survey attribute definition system 106 can modify the metadata associated with the digital survey data to include a reference to the storage location of the attribute definition.
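A hypothetical sketch of both metadata approaches, treating survey questions and attribute definitions as plain dictionaries (the field names are illustrative assumptions, not the disclosed schema):

```python
def apply_attribute_definition(survey_question: dict, definition: dict,
                               by_reference: bool = False) -> dict:
    """Record an attribute definition in a question's metadata, either by
    embedding the definition itself or by storing a reference to its
    separate storage location."""
    metadata = survey_question.setdefault("metadata", {})
    if by_reference:
        metadata["attribute_definition_ref"] = definition["location"]
    else:
        metadata["attribute_definition"] = {
            "id": definition["id"],
            "name": definition["name"],
            "enumerated_categories": definition.get("enumerated_categories", []),
        }
    return survey_question
```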
As also shown in
Further, the survey attribute definition system 106 can determine that the digital survey and/or digital survey question corresponding to the digital survey response is associated with an attribute definition. As shown in
Similar to applying an attribute definition to a digital survey and/or a digital survey question, the survey attribute definition system 106 can apply an attribute definition to a digital survey response by modifying digital metadata associated with the digital survey response. In one or more embodiments, the survey attribute definition system 106 identifies an attribute definition in digital metadata associated with a digital survey. Further, in response to receiving a digital survey response associated with that digital survey, the survey attribute definition system 106 identifies the attribute definition from the digital metadata associated with the digital survey. Additionally, based on identifying the attribute definition in the digital metadata associated with the digital survey, the survey attribute definition system 106 applies the same attribute definition to the digital survey response.
Thus, the survey attribute definition system 106 can automatically apply a selected attribute definition to all survey responses corresponding to a digital survey and/or digital survey question. Accordingly, based on a single selection of a survey creator (e.g. a survey administrator), the survey attribute definition system 106 can apply attribute definitions to a large volume of digital survey responses. The survey attribute definition system 106 can identify and utilize these defined digital survey responses utilizing their associated attribute definitions.
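Propagating a question's attribute definition onto every collected response can be sketched as follows; the dictionary layout is a hypothetical assumption:

```python
def tag_responses(question: dict, responses: list) -> list:
    """Copy the attribute definition from a question's metadata onto every
    response collected for that question."""
    definition = question.get("metadata", {}).get("attribute_definition")
    for response in responses:
        if definition is not None:
            response.setdefault("metadata", {})["attribute_definition"] = definition
    return responses
```

In this picture, a single administrator selection on the question tags an arbitrarily large volume of incoming responses.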
As further shown in
While
In another example, the survey attribute definition system 106 can generate digital permissions requirements for digital survey resources based on associated attribute definitions. To illustrate, the survey attribute definition system 106 can provide access to digital survey results to a select group of users based on an associated attribute definition “Sensitive.” In another example, the survey attribute definition system 106 can provide access to a digital survey for users on a marketing team based on an association of the digital survey with the attribute definition “Marketing.”
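One illustrative way to enforce such permissions (the rule table and group names are assumptions, not part of the disclosure) is to require membership in an allowed group for every restricted attribute definition on a resource:

```python
# Hypothetical mapping from attribute definitions to the user groups allowed
# to access survey data carrying those definitions.
ACCESS_RULES = {
    "Sensitive": {"privacy_team"},
    "Marketing": {"marketing_team", "privacy_team"},
}

def can_access(user_groups: set, attribute_definitions: list) -> bool:
    """A user may access a survey resource only if, for every restricted
    attribute definition on the resource, the user belongs to an allowed
    group; definitions absent from ACCESS_RULES are unrestricted."""
    for definition in attribute_definitions:
        allowed = ACCESS_RULES.get(definition)
        if allowed is not None and not (user_groups & allowed):
            return False
    return True
```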
The survey attribute definition system 106 can also utilize attribute definitions to determine user segmentation. For example, the survey attribute definition system 106 can generate user segmentation for digital survey respondents based on attribute definitions associated with their digital survey responses. Additionally, the survey attribute definition system 106 can drive segmentation of a variety of kinds of digital survey data.
The survey attribute definition system 106 can also generate digital survey analysis utilizing a machine learning model (e.g. a neural network). For example, the survey attribute definition system 106 can input digital survey data into a neural network to predict a target attribute or target action associated with the digital survey data. Further, the survey attribute definition system 106 can generate a digital survey report communicating the target attribute and/or target action. In some embodiments, the survey attribute definition system 106 can further generate the report including a summary of the inputted digital survey data utilized to determine the target attribute and/or target action.
The survey attribute definition system 106 can train the neural network utilizing training digital survey data and ground-truth target actions. The survey attribute definition system 106 can input the training digital survey data into an untrained neural network to generate predicted target actions. Further, the survey attribute definition system 106 can modify the neural network to minimize a loss function based on a comparison of the predicted target actions to the ground-truth target actions.
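As a minimal, hypothetical sketch of this training loop, the following substitutes a single-layer logistic model for the neural network and synthetic data for the training digital survey data and ground-truth target actions; the gradient step plays the role of modifying the model to minimize a loss function.

```python
import numpy as np

# Stand-in for the described training loop: a single-layer logistic model
# replaces the neural network; synthetic data replaces the training digital
# survey data and ground-truth target actions.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))              # training digital survey features
true_w = np.array([1.0, -2.0, 0.5, 0.0])
y = (X @ true_w > 0).astype(float)         # ground-truth target actions (0/1)

w = np.zeros(4)
for _ in range(200):
    pred = 1.0 / (1.0 + np.exp(-(X @ w)))  # predicted target actions
    grad = X.T @ (pred - y) / len(y)       # gradient of cross-entropy loss
    w -= 0.5 * grad                        # update to minimize the loss

accuracy = ((1.0 / (1.0 + np.exp(-(X @ w))) > 0.5) == y).mean()
```

A production system would use a deeper network and held-out validation data, but the compare-and-update structure is the same.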
As mentioned above, the survey attribute definition system 106 can apply user-selected attribute definitions in a survey creation graphical user interface.
As further shown in
In some embodiments, the survey attribute definition system 106 determines attribute tags based on administrator input (e.g. from a survey administrator device). Further, the survey attribute definition system 106 can determine attribute definitions corresponding to each attribute tag based on administrator input. Additionally or in the alternative, the survey attribute definition system 106 can determine which attribute definitions correspond to attribute tags utilizing a machine learning model. For example, the survey attribute definition system 106 can train a neural network to categorize attribute definitions into attribute tags utilizing training attribute definitions and ground-truth attribute tags. The survey attribute definition system 106 can input the training attribute definitions into an untrained neural network to generate predicted attribute tags. Then, the survey attribute definition system 106 can compare the predicted attribute tags to the ground-truth attribute tags and can modify the neural network accordingly. For example, the survey attribute definition system 106 can modify the neural network based on the comparison to minimize a loss function.
As also shown in
As also shown in
Further, the survey attribute definition system 106 can perform an act 408 of receiving selection of an attribute definition. Similar to the discussion above with regard to selection of an attribute tag, the survey attribute definition system 106 can receive an indication of user input from an administrator device. Further, in some embodiments, the survey attribute definition system 106 interprets the received indication of user input as a selection of a corresponding attribute definition.
As also shown in
Additionally, the survey attribute definition system 106 can apply the selected attribute definition to digital survey responses received in response to the digital survey question. As discussed above, the survey attribute definition system 106 can identify a received digital survey response associated with a digital survey question corresponding to an attribute definition. Based on this identification, the survey attribute definition system 106 can apply the same attribute definition to the digital survey response. For example, in
In some embodiments, the survey attribute definition system 106 can analyze digital survey questions (or digital survey responses) to determine and/or suggest attribute definitions to apply.
For example, as shown in
Additionally, the survey attribute definition system 106 can perform a search of attribute definitions corresponding to the identified keywords. In one or more embodiments, the survey attribute definition system 106 utilizes keyword matching (or semantic meaning matching) to perform the search. The survey attribute definition system 106 can return attribute definitions related to the keywords. For example, the survey attribute definition system 106 can return attribute definitions with text related to one or more of the keywords, attribute definitions in attribute tags related to the keywords, attribute definitions associated with data related to the keywords, or attribute definitions having a variety of other relations to the keywords.
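A simple keyword-matching search of this kind might be sketched as follows; the pool of attribute definitions, their tags, and their descriptive text are invented for the example.

```python
# Illustrative keyword search over a pool of attribute definitions. A match
# occurs when any keyword appears in a definition's name, tags, or text.
ATTRIBUTE_DEFINITIONS = {
    "Customer Satisfaction": {"tags": ["Experience"], "text": "overall satisfaction rating"},
    "Shipping": {"tags": ["Operations"], "text": "delivery and shipping feedback"},
    "Marketing": {"tags": ["Outreach"], "text": "campaign and promotion response"},
}

def search_definitions(keywords):
    """Return definitions whose name, tags, or text mention any keyword."""
    matches = []
    for name, info in ATTRIBUTE_DEFINITIONS.items():
        haystack = " ".join([name, *info["tags"], info["text"]]).lower()
        if any(kw.lower() in haystack for kw in keywords):
            matches.append(name)
    return matches

result = search_definitions(["shipping", "delivery"])
```

Semantic matching would replace the substring test with an embedding-similarity comparison, but the surrounding search structure stays the same.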
As mentioned, the survey attribute definition system 106 can utilize natural language processing to determine attribute definitions to suggest for application to the digital survey question. For example, the survey attribute definition system 106 can utilize rule-based and/or statistical natural language processing. To illustrate, the survey attribute definition system 106 can apply a coded set of rules to a digital survey question, such as grammatical rules and/or heuristic rules for stemming. Based on application of these rules to the digital survey question, the survey attribute definition system 106 can determine a listing of attribute definitions to suggest for application to the digital survey question.
Additionally, the survey attribute definition system 106 can utilize a machine learning model for natural language processing of keywords. For example, the survey attribute definition system 106 can utilize a neural network trained utilizing a set of ground-truth keyword data. For example, such a neural network can perform terminology extraction to automatically extract relevant terms from input to the neural network. The survey attribute definition system 106 can input the digital survey question into such a neural network. Further, the survey attribute definition system 106 can utilize the returned terminology as keywords for a keyword search for attribute definitions, as described above.
Additionally, in some embodiments, the survey attribute definition system 106 utilizes string comparison to determine attribute definitions to suggest for application to a digital survey question. For example, the survey attribute definition system 106 can utilize a string comparison comparing identified keywords (e.g. by keyword extraction or natural language processing) to a pool of attribute definitions. For example, the survey attribute definition system 106 can compare values or references of the keywords with a pool of attribute definitions. In one or more embodiments, the survey attribute definition system 106 can compare values lexicographically and return attribute definitions with sufficient similarity.
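A lexicographic comparison of this kind can be sketched with Python's standard difflib module; the 0.6 similarity cutoff is an arbitrary example value rather than one prescribed by the system.

```python
from difflib import SequenceMatcher

# Sketch of the string-comparison approach: score each attribute definition
# in a pool against an extracted keyword and keep sufficiently similar ones.
def similar_definitions(keyword, pool, threshold=0.6):
    scored = []
    for definition in pool:
        ratio = SequenceMatcher(None, keyword.lower(), definition.lower()).ratio()
        if ratio >= threshold:
            scored.append((definition, ratio))
    # Most similar definitions first
    return [d for d, _ in sorted(scored, key=lambda item: item[1], reverse=True)]

pool = ["Satisfaction", "Shipping", "Marketing"]
matches = similar_definitions("satisfactin", pool)  # tolerates the typo
```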
The survey attribute definition system 106 can further utilize various data associated with digital surveys to determine attribute definitions. For example, in addition to utilizing the text of a digital survey question, the survey attribute definition system 106 can utilize associated metadata. For example, the survey attribute definition system 106 can analyze the name of the digital survey, a label previously applied to a digital survey question (e.g., a non-global label applied by an entity device), surrounding digital survey questions within the digital survey, and/or previous digital survey responses to the digital survey question. The survey attribute definition system 106 can also utilize these features associated with the digital survey question (e.g. as additional input for a neural network, keyword search, or other approaches described above).
Additionally, the survey attribute definition system 106 can train and utilize an attribute definition machine learning model that returns attribute definitions based on receiving digital survey questions as input. For example, the survey attribute definition system 106 can utilize a convolutional neural network, a recurrent neural network, or other neural network to determine an attribute definition corresponding to a digital survey question. To illustrate, the survey attribute definition system 106 can train an attribute definition neural network utilizing training digital survey questions (or other training survey data) and corresponding ground-truth attribute definitions. In some embodiments, the survey attribute definition system 106 inputs the training digital survey questions into an untrained neural network to generate predicted attribute definitions. Further, the survey attribute definition system 106 can compare the predicted attribute definitions to the ground-truth attribute definitions and can modify the neural network accordingly. The survey attribute definition system 106 can modify the neural network in order to minimize a loss function.
After the survey attribute definition system 106 trains the attribute definition neural network, the survey attribute definition system 106 can provide a digital survey question (or other features of the digital survey question) as input to the attribute definition neural network. Then, the survey attribute definition system 106 can predict one or more attribute definitions and provide the predicted attribute definitions as suggestions.
In addition to utilizing a neural network, the survey attribute definition system 106 can utilize a variety of types of machine learning models. For example, the survey attribute definition system 106 can utilize a decision tree, a support vector machine, naïve Bayes, perceptrons, or a variety of other machine learning models. More specifically, the survey attribute definition system 106 can use any of these machine learning models to determine attribute definitions to apply to various kinds of digital survey data.
Further, in one or more embodiments, the survey attribute definition system 106 can utilize the analysis methods described above to determine an attribute definition to automatically apply to existing digital survey data. For example, the survey attribute definition system 106 can analyze digital survey data in a digital survey database, including digital survey questions and digital survey responses, to determine one or more attribute definitions to apply. Rather than suggesting the attribute definitions, in some embodiments, the survey attribute definition system 106 assigns the attribute definitions to the digital survey data automatically.
For example, in addition to analyzing a digital survey question during digital survey creation, the survey attribute definition system 106 can apply one or more of the above-discussed methods of analysis to an existing digital survey question in a digital survey database. However, rather than utilizing the resulting attribute definitions as suggestions in a graphical user interface, the survey attribute definition system 106 can select one or more of the resulting attribute definitions to automatically apply to the digital survey question. For example, the survey attribute definition system 106 can automatically apply the most likely or highest scoring attribute definition to the digital survey question.
In addition or in the alternative, the survey attribute definition system 106 can automatically apply any attribute definition having a confidence score or another suggestion metric that satisfies an attribute definition threshold. The survey attribute definition system 106 can set the attribute definition threshold based on the type of confidence metric utilized in a particular analysis. In some embodiments, the survey attribute definition system 106 can set or adjust the attribute definition threshold based on received administrator input (e.g. via a survey administrator or entity administrator device).
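The threshold-based automatic application can be sketched as follows; the scores, names, and 0.8 default threshold are illustrative assumptions.

```python
# Hypothetical sketch: suggestions whose confidence score satisfies the
# attribute definition threshold are applied automatically; the rest are
# held back for manual review.
def auto_apply(suggestions, threshold=0.8):
    """Split scored suggestions into auto-applied and review-needed lists."""
    applied = [name for name, score in suggestions if score >= threshold]
    to_review = [name for name, score in suggestions if score < threshold]
    return applied, to_review

suggestions = [("Customer Satisfaction", 0.93), ("Shipping", 0.55)]
applied, to_review = auto_apply(suggestions)
```

Raising or lowering the threshold (e.g. in response to administrator input, as described above) shifts definitions between the two lists.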
Though
For example, the survey attribute definition system 106 can automatically apply attribute definitions to existing digital survey responses in a digital survey database. The survey attribute definition system 106 can apply the above-described methods of analysis to the digital survey response. In addition, in one or more embodiments, the survey attribute definition system 106 identifies a digital survey and/or digital survey question corresponding to the digital survey response. For example, the survey attribute definition system 106 can identify the digital survey question that the digital survey response was submitted in response to. The survey attribute definition system 106 can further perform the analysis of the corresponding digital survey and/or digital survey question to determine an attribute definition to apply to the digital survey response.
Similarly, when automatically applying attribute definitions to existing digital survey data (e.g. in a digital survey database), the survey attribute definition system 106 can utilize a variety of data associated with the existing survey data. The survey attribute definition system 106 can utilize a variety of types of metadata associated with the existing survey data. In some embodiments, the survey attribute definition system 106 utilizes existing attribute definitions, dates and/or times associated with the digital survey data, labels, or various other data. For example, when applying global attribute definitions to digital survey data in a digital survey database, the survey attribute definition system 106 can utilize existing entity-specific attribute definitions (and/or other definitions or tags) as additional keywords in a keyword analysis. In addition or in the alternative, the survey attribute definition system 106 can utilize such existing attribute definitions as additional input to a machine learning-based approach.
Accordingly, the survey attribute definition system 106 can utilize the above-described analysis of various digital survey data to automatically apply attribute definitions either in creation of a digital survey or to existing digital survey data. The survey attribute definition system 106 can utilize a variety of these approaches, including in combination. Additionally, as described above, the survey attribute definition system 106 can utilize the above-described analysis to provide suggested attribute definitions to new or existing digital survey data.
As illustrated in
The survey attribute definition system 106 can provide the attribute definitions returned by one or more of the methods discussed above as suggested attribute definitions. In some embodiments, the survey attribute definition system 106 can determine which attribute definitions are most likely. For example, the survey attribute definition system 106 can utilize a confidence score returned by a neural network, or a percent match from a search query. The survey attribute definition system 106 can provide the attribute definitions from most likely (or best match) to least likely (or worst match).
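This ordering can be sketched as a simple sort over (definition, confidence) pairs; the names and scores are invented for the example.

```python
# Illustrative ranking of suggested attribute definitions: best match first,
# regardless of whether the score is a neural-network confidence or a
# search-query percent match.
def rank_suggestions(scored_definitions):
    """Sort (definition, confidence) pairs from most to least likely."""
    return sorted(scored_definitions, key=lambda pair: pair[1], reverse=True)

ranked = rank_suggestions([("Shipping", 0.4), ("Customer Satisfaction", 0.9)])
```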
Based on receiving user selection of one of the suggested attribute definitions, the survey attribute definition system 106 can apply the selected attribute definition to the digital survey and/or digital survey question. Additionally, as discussed above, in some embodiments the survey attribute definition system 106 automatically applies the selected attribute definition to digital survey responses corresponding to the digital survey and/or digital survey question. Accordingly, the survey attribute definition system 106 can automatically apply attribute definitions to digital survey responses based on a single user selection, and further based on one or more of the above-described analytical models.
Turning to
As shown in
As further shown in
In addition or in the alternative, the survey attribute definition system 106 can receive the digital survey template 614 without any association with an attribute definition. In such embodiments, the survey attribute definition system 106 can analyze the digital survey question in the digital survey template 614 using one or more of the variety of analysis methods described above with regard to
As also shown in
As discussed above, the survey attribute definition system 106 can also apply attribute definitions to digital survey data based on a modified digital survey and/or a modified digital survey question.
Further, as shown in
Further, the survey attribute definition system 106 can determine that a digital survey question in a new digital survey is similar to a previously used digital survey question. To illustrate, even if a digital survey question is not generated as a result of modifying a digital survey question, the survey attribute definition system 106 can treat the digital survey question as a modified digital survey question by identifying sufficient similarity between the digital survey question and a previously utilized digital survey question. In some embodiments, the survey attribute definition system 106 can determine that the two digital survey questions have a threshold similarity based on natural language and/or string comparison. Then, the survey attribute definition system 106 can treat the previously used digital survey question as the prior version of the new digital survey question.
The survey attribute definition system 106 can identify a survey attribute associated with the digital survey questions of the prior digital survey before modification. For example, in
Further, the survey attribute definition system 106 can identify an attribute definition associated with the previous version of a modified digital survey question. For the example shown in
In some embodiments, the survey attribute definition system 106 utilizes both the attribute definition associated with the prior version of the digital survey question and the attribute definitions returned by analyzing the second digital survey question 716 in isolation to determine an attribute definition to apply to the second digital survey question 716. In some embodiments, the survey attribute definition system 106 selects from among the attribute definitions based on a confidence metric associated with the previously used attribute definition and each attribute definition returned in digital survey analysis of the second digital survey question 716. The survey attribute definition system 106 can assign a confidence metric to attribute definitions based on a metric assigned during digital survey analysis. The survey attribute definition system 106 can additionally assign and/or increase a confidence metric based on a manual confirmation or selection of the attribute definition.
In one or more embodiments, the survey attribute definition system 106 selects the attribute definition with the highest confidence score from among the previously used attribute definition and the attribute definitions returned by analyzing the second digital survey question 716.
Additionally, in some embodiments, the survey attribute definition system 106 utilizes the prior attribute definition (attribute definition of a survey question prior to modification) as input to a machine learning model. For example, the survey attribute definition system 106 can provide the modified survey question and the prior attribute definition to a machine learning model and the machine learning model can predict an attribute definition based on the input. For example, the survey attribute definition system 106 can train a neural network utilizing this input data as described above with regard to
In some embodiments, the survey attribute definition system 106 selects an attribute definition to automatically apply based on the weighted combination of the attribute definitions. The survey attribute definition system 106 can automatically apply the attribute definition to the modified digital survey question. In the alternative, in some embodiments, the survey attribute definition system 106 provides the determined attribute definition in the survey modification graphical user interface for approval by a user (e.g. via an entity administrator device and/or a survey administrator device).
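One possible weighted combination is sketched below; the 0.6/0.4 weights and the scoring scheme are illustrative assumptions rather than values taken from the system.

```python
# Hypothetical sketch: blend the attribute definition from the prior version
# of a question with definitions returned by fresh analysis, then pick the
# highest combined score.
def combine_scores(prior, analysis_scores, prior_weight=0.6, analysis_weight=0.4):
    """Blend a prior definition with analysis scores and pick the best."""
    combined = {name: analysis_weight * score
                for name, score in analysis_scores.items()}
    combined[prior] = combined.get(prior, 0.0) + prior_weight
    return max(combined, key=combined.get)

choice = combine_scores("Customer Satisfaction",
                        {"Shipping": 0.9, "Customer Satisfaction": 0.7})
```

Here the prior definition wins (0.6 + 0.4 × 0.7 = 0.88) over the fresh analysis result (0.4 × 0.9 = 0.36); shrinking the prior weight would flip the outcome.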
As further shown in
As mentioned above, an attribute definition can be associated with one or more schema modules and one or more attribute tags.
Further, in one or more embodiments, the survey attribute definition system 106 can validate incoming attributes and their associated data. For example, the survey attribute definition system 106 can receive a large repository of digital survey data from an entity device. The survey attribute definition system 106 can analyze labels corresponding to the digital survey data and validate the labels with regard to attribute definitions of a label schema. If the labels do not conform to the label schema, the survey attribute definition system 106 can analyze the digital survey data to identify attribute definitions (e.g., attribute definitions to replace or supplement the existing labels).
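Such schema validation can be sketched as partitioning incoming labels into conforming and non-conforming sets; the label schema contents are invented for the example.

```python
# Illustrative label schema: the set of attribute definitions that incoming
# labels are expected to conform to.
LABEL_SCHEMA = {"Customer Satisfaction", "Shipping", "Marketing"}

def validate_labels(labels):
    """Partition labels into schema-conforming and non-conforming lists."""
    valid = [label for label in labels if label in LABEL_SCHEMA]
    invalid = [label for label in labels if label not in LABEL_SCHEMA]
    return valid, invalid

valid, invalid = validate_labels(["Shipping", "CSAT-legacy"])
```

Non-conforming labels (here "CSAT-legacy") would then be routed to the analysis described above to identify replacement or supplemental attribute definitions.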
The survey attribute definition system 106 can further utilize this additional data to interpret survey data. For example, the attribute definition 802 may include an "organizationID" corresponding to a string value type. The "organizationID" can denote an entity associated with an attribute definition. In some embodiments, some attribute definitions are associated with the survey system rather than with a particular entity. Attribute definitions associated with a particular entity may only be viewable and usable by that entity.
Additionally, in one or more embodiments, the attribute definition 802 is associated with an ID or primary key. In some embodiments, the primary key is a globally unique identifier value (e.g. a GUID). The survey attribute definition system 106 can utilize a primary key associated with the attribute definition 802 to identify the attribute definition 802 when applied to digital survey data, including digital survey responses.
The attribute definition 802 can also be associated with a “friendlyID” corresponding to a string value type. In some embodiments, the “friendlyID” is an optional identifier that utilizes standardized naming conventions (e.g. Java naming conventions). The “friendlyID” can be a readable identifier for use within particular entities. In some embodiments, the “friendlyID” may be the same as the primary key. Additionally or in the alternative, the “friendlyID” may be an entity-specific identifier associated with a primary key for recognition within a particular entity.
The attribute definition 802 can also be associated with various data points that give context about the creation or modification of the attribute definition 802. For example, the attribute definition 802 can be associated with a description having a string value type. In some embodiments, the description is an easily readable description of the attribute definition 802. In one or more embodiments, the attribute definition 802 is also associated with a “publishedAt” value having a string value type (e.g. ISO-8601 String). The “publishedAt” value can include the time and date when the attribute definition 802 was published. Similarly, the attribute definition 802 can be associated with “createdAt” and “updatedAt” values having a string value type (e.g. ISO-8601 String). The “createdAt” value can include the time and date at which the attribute definition 802 was created. The “updatedAt” value can include a time and date at which the attribute definition 802 was updated.
In some embodiments, the attribute definition 802 is also associated with a “valueType.” The “valueType” can include an assigned type of value associated with the attribute definition 802. For example, a “valueType” can include an integer, a decimal, a string, a string list, an enumeration, an enumeration list, a date format, a time format, and a variety of other types of values. The “valueType” can be an entity specific value type or a standardized value type (e.g. an ISO-8601 format).
If the “valueType” is an enumeration type or an enumeration list, the attribute definition 802 may further be associated with an “enumID.” In some embodiments, an “enumID” is a globally unique identifier (GUID). The survey attribute definition system 106 can utilize the “enumID” to interpret survey data associated with the attribute definition 802.
The attribute definition 802 can also optionally be associated with a “defaultValue” in a string format (e.g. JSON). The “defaultValue” can specify a default value associated with the attribute definition 802. When specified, the survey attribute definition system 106 can utilize a “defaultValue” to fill in a default value to eliminate missing values. This ensures that “missing” data is treated uniformly in all computations and preserves backward compatibility with existing data sets.
In one or more embodiments, the attribute definition 802 is further associated with a “validationHints” value in a string format (e.g. JSON). This optional value stores a subset of schema validation terms for a variety of formats. For example, the “validationHints” value can set a minimum or maximum value or can include a listing of acceptable values. Accordingly, the survey attribute definition system 106 can utilize attribute definitions to determine both an acceptable format for values and additional restrictions on acceptable values.
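The interplay of "defaultValue" and "validationHints" can be sketched as follows, assuming both are stored as JSON strings as described above; the hint keys used here ("min", "max", "acceptable") are illustrative.

```python
import json

# Hypothetical sketch: fill a missing value from "defaultValue", then check
# it against "validationHints" (a minimum, a maximum, or a list of
# acceptable values). Both fields are JSON-encoded strings.
definition = {
    "defaultValue": json.dumps(0),
    "validationHints": json.dumps({"min": 0, "max": 100}),
}

def resolve_and_validate(value, definition):
    if value is None:
        value = json.loads(definition["defaultValue"])   # fill missing value
    hints = json.loads(definition["validationHints"])
    if "min" in hints and value < hints["min"]:
        return value, False
    if "max" in hints and value > hints["max"]:
        return value, False
    if "acceptable" in hints and value not in hints["acceptable"]:
        return value, False
    return value, True

filled, ok = resolve_and_validate(None, definition)      # missing -> default 0
_, in_range = resolve_and_validate(150, definition)      # 150 exceeds max
```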
Further, the attribute definition 802 can be associated with an “authorization” value in a string format. The “authorization” value can be a single tag name that indicates an authorization rule associated with the attribute definition 802. For example, the “authorization” value can dictate a permissions level necessary to view digital survey data associated with the “authorization” value. A null value can indicate that no special permission is needed for access. Accordingly, the survey attribute definition system 106 can efficiently facilitate permissions corresponding to digital survey data.
In one or more embodiments, the attribute definition 802 is additionally associated with a "classification" value in the format of a list of strings. The "classification" value can include a list of tag names that apply to the attribute definition 802. The survey attribute definition system 106 can utilize these tags to identify digital survey data associated with the attribute definition 802.
Additionally, as shown in
Like the attribute definition 802, the enumeration type 804 can be associated with an "organizationID," a primary key, an easily readable label, a "description" value, a "publishedAt" value, a "createdAt" value, and an "updatedAt" value. Each of these values corresponds to the enumeration type 804 itself and may not be identical to the analogous value corresponding to the attribute definition 802.
Further, in some embodiments, the enumeration type 804 is associated with a “members” value that defines members of the enumeration type 804. In some embodiments, the “members” value includes any enumeration items that are potential members of the enumeration type 804. Accordingly, this “members” value can define appropriate values for digital survey data associated with the attribute definition 802. In some embodiments, enumeration items are ordered based on a position in a corresponding array. Such an array can be used in displays corresponding to the enumeration type 804. The “members” value may further be associated with an easily readable and understandable label.
As further shown in
For example, the schema module(s) 806 can be associated with an “organizationID,” a primary key, an easily readable label, a “description” value, a “publishedAt” value, a “createdAt” value, and an “updatedAt” value. Each of the schema module(s) 806 can be associated with each of these values. Further, these values are independent of those associated with either the attribute definition 802 or the enumeration type 804.
In some embodiments, the schema module(s) 806 are each further associated with attribute definitions. Accordingly, the schema module(s) 806 can each be associated with an “attributeDefs” value listing each included attribute definition. In some embodiments, the attribute definitions are listed in a standardized format (e.g. in JSON). The same attribute definition can be included in multiple of the schema module(s) 806.
As further shown in
Similar to the discussion above, the attribute tag(s) 808 can each be associated with an "organizationID," a primary key, an easily readable label, a "description" value, a "publishedAt" value, a "createdAt" value, and an "updatedAt" value. In one or more embodiments, these values are maintained independently of those corresponding to the attribute definition 802. Additionally, in one or more embodiments, the attribute tag(s) 808 are each further associated with an "entitlement" value in a string format. The "entitlement" value can denote an entitlement ID required to view the attribute tag(s) 808.
Similar to the schema module(s) 806, the attribute tag(s) 808 can each further be associated with a listing of attribute definitions. Accordingly, the attribute tag(s) 808 can each be associated with an “attributeDefs” value formatted as an array of strings. In some embodiments, the “attributeDefs” value contains the listing of attribute definitions tagged in a particular attribute tag of the attribute tag(s) 808.
The survey attribute definition system 106 can associate an attribute definition with an additional value specific to a particular entity (e.g. based on received input from an entity administrator device). This can include a variety of values, including those described above in association with the attribute definition 802, the enumeration type 804, the schema module(s) 806, and the attribute tag(s) 808. For instance, an entity device can choose to map an attribute definition to a custom algorithm, custom labels, or custom data types. Accordingly, the survey attribute definition system 106 can facilitate entity customization and usage while maintaining overall usage of attribute definitions across various entities.
The survey attribute definition system 106 can utilize any of these additional values associated with the attribute definition 802, the enumeration type 804, the schema module(s) 806, and the attribute tag(s) 808 to identify and interpret digital survey data. Each of these values provides additional context and data for association with various survey data. Further, the survey attribute definition system 106 can utilize these values to generate digital survey analysis.
As discussed above, the survey attribute definition system 106 can generate digital survey analysis utilizing attribute definitions. More specifically, the survey attribute definition system 106 can generate a survey dashboard utilizing attribute definitions.
For example, the graphical user interface 900 includes the dashboard panel 902. The survey attribute definition system 106 generates the dashboard panel 902 to reflect customer satisfaction (e.g. CSAT). The survey attribute definition system 106 generates the dashboard panel 902 including a customer satisfaction rating of 78%. The survey attribute definition system 106 can determine this customer satisfaction rating from a variety of digital surveys corresponding to an entity. For example, the survey attribute definition system 106 can identify, from a digital survey database, digital survey results corresponding to the given entity and associated with the attribute definition “Customer Satisfaction.” Then, the survey attribute definition system 106 can interpret the identified survey results as percentages and determine a mean percentage to include in the dashboard panel 902.
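The computation behind such a panel can be sketched as follows; the response records and entity names are invented, chosen so the mean matches the 78% rating described above.

```python
# Illustrative CSAT computation: select an entity's responses associated
# with the "Customer Satisfaction" attribute definition, interpret them as
# percentages, and average them.
responses = [
    {"entity": "acme", "definitions": ["Customer Satisfaction"], "value": 80},
    {"entity": "acme", "definitions": ["Customer Satisfaction"], "value": 76},
    {"entity": "acme", "definitions": ["Shipping"], "value": 10},
    {"entity": "other", "definitions": ["Customer Satisfaction"], "value": 50},
]

def mean_csat(responses, entity):
    values = [r["value"] for r in responses
              if r["entity"] == entity
              and "Customer Satisfaction" in r["definitions"]]
    return sum(values) / len(values)

rating = mean_csat(responses, "acme")  # (80 + 76) / 2 = 78.0
```

Note that the "Shipping" response and the other entity's response are filtered out before averaging, which is what makes the attribute definition useful as a selection key.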
The survey attribute definition system 106 can further generate the dashboard panel 902 to include comparisons to prior fiscal quarters for the entity.
Additionally, the survey attribute definition system 106 can generate the dashboard panel 902 to include a comparison of the entity to competitor entities.
However, it will be appreciated that interpreting digital survey results as percentages based on the attribute definition “Customer Satisfaction” and determining means is given by way of example. The survey attribute definition system 106 can identify and interpret survey results based on a variety of attribute definitions. Further, it will be appreciated that the survey attribute definition system 106 can include comparisons based on a variety of criteria, and differences in time and entity are given by way of example.
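The mean-percentage computation described above can be sketched as follows. The record layout (a list of dictionaries with `value` and `attribute_definitions` fields) is an assumption for illustration, not the disclosed database schema:

```python
def mean_csat(survey_results, attribute_label="Customer Satisfaction"):
    """Average the results tagged with the given attribute definition,
    interpreting each result value as a percentage."""
    values = [r["value"] for r in survey_results
              if attribute_label in r.get("attribute_definitions", [])]
    if not values:
        return None
    return sum(values) / len(values)

# Hypothetical survey results; the "Support Tickets" row is ignored
results = [
    {"value": 80.0, "attribute_definitions": ["Customer Satisfaction"]},
    {"value": 76.0, "attribute_definitions": ["Customer Satisfaction"]},
    {"value": 3.0,  "attribute_definitions": ["Support Tickets"]},
]
print(mean_csat(results))  # 78.0
```

Filtering by attribute definition rather than by survey lets results from many different surveys feed one dashboard value.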
As also shown in
For example, for the Customer Journey in dashboard panel 904, the survey attribute definition system 106 can identify digital survey responses associated with both the attribute definition “Customer Satisfaction” and the attribute definition “Order Placement” for a first category. Further, the survey attribute definition system 106 can identify digital survey responses associated with both the attribute definition “Customer Satisfaction” and the attribute definition “Shipping” for an additional category. The survey attribute definition system 106 can identify a variety of digital survey responses corresponding to various stages of customer interaction utilizing attribute definitions.
Additionally, as described above, the survey attribute definition system 106 can separate groupings into digital survey responses associated with the entity and digital survey responses associated with other entities. Further, the survey attribute definition system 106 can determine a mean for each of the variety of groups. That is, the survey attribute definition system 106 can determine a mean for each grouping corresponding to both a stage of the customer journey and either the entity or other entities in the region. Accordingly, the survey attribute definition system 106 can utilize the determined means to generate a line graph comparing the entity to other entities in the region at each stage in a customer journey. Thus, the survey attribute definition system 106 can utilize attribute definitions to generate a variety of kinds of survey analysis and a variety of visual representations of that analysis.
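The grouping described above — by customer-journey stage and by entity versus other entities, with a mean per group — can be sketched as below. The response schema and entity identifiers are illustrative assumptions:

```python
from collections import defaultdict

def journey_stage_means(responses, entity_id, stages):
    """Mean satisfaction per customer-journey stage, split into the entity's
    own responses vs. all other entities (illustrative record layout)."""
    sums = defaultdict(lambda: [0.0, 0])  # (stage, group) -> [total, count]
    for r in responses:
        stage = next((s for s in stages
                      if s in r["attribute_definitions"]), None)
        if stage is None or "Customer Satisfaction" not in r["attribute_definitions"]:
            continue
        group = "entity" if r["entity"] == entity_id else "others"
        bucket = sums[(stage, group)]
        bucket[0] += r["value"]
        bucket[1] += 1
    return {key: total / count for key, (total, count) in sums.items()}

# Hypothetical responses tagged with two attribute definitions each
responses = [
    {"entity": "acme",  "value": 90,
     "attribute_definitions": ["Customer Satisfaction", "Order Placement"]},
    {"entity": "acme",  "value": 70,
     "attribute_definitions": ["Customer Satisfaction", "Shipping"]},
    {"entity": "rival", "value": 60,
     "attribute_definitions": ["Customer Satisfaction", "Shipping"]},
]
means = journey_stage_means(responses, "acme", ["Order Placement", "Shipping"])
```

Each `(stage, group)` mean then supplies one point on a line graph comparing the entity to other entities at each journey stage.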
As further illustrated in
Additionally, the survey attribute definition system 106 can generate the dashboard panel 908 for inclusion in the graphical user interface 900. As shown in
Turning now to
For example, as shown in
Additionally, as shown in
As also shown in
The survey attribute definition system 106 can also generate the graphical user interface 1000 including an attribute definition chart including attribute definition columns 1008-1018. The attribute definition chart can include a variety of information and/or options associated with attribute definitions, and the attribute definition columns 1008-1018 are given by way of example. Additionally, though
The graphical user interface 1000 may include the directory level label column 1008 corresponding to an initial (e.g., entity-specific) label of digital survey data (e.g., labels that may not correspond to attribute definitions of a schema label). As shown in
As also shown in
Further, the graphical user interface 1000 can include the attribute definition column 1012 including a creation date corresponding to each attribute definition. Additionally, the survey attribute definition system 106 can generate the graphical user interface 1000 including the attribute definition column 1014 including a number of unique digital survey data items tagged by the attribute definition. The graphical user interface 1000 can also include the attribute definition columns 1016 and 1018 providing various options corresponding to each attribute definition, including an option to edit.
In some embodiments, the survey attribute definition system 106 can generate new or updated attribute definitions based on user interaction with the attribute definition table. Additionally, in one or more embodiments, the survey attribute definition system 106 can recommend an existing attribute definition based on user input indicating a directory level label for the attribute definition.
For example, the survey attribute definition system 106 can provide the graphical user interface 1020 to an entity administrator device based on the directory level label corresponding to the row of the received user input. The graphical user interface 1020 can include global attribute definitions pre-defined by a survey administrator device. This enables more efficient generation of an attribute definition for entity use, with fewer interactions. Further, recommending global attribute definitions for use in an entity improves consistency across entities.
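One way to recommend pre-defined global attribute definitions from a directory level label is simple fuzzy matching on the labels. This is a hedged sketch only; the disclosure does not specify the matching technique, and the function and threshold here are assumptions:

```python
import difflib

def recommend_definitions(directory_label, global_definitions, n=3, cutoff=0.5):
    """Suggest global attribute definitions whose labels resemble an
    entity's directory level label (case-insensitive fuzzy match)."""
    lowered = {d.lower(): d for d in global_definitions}
    matches = difflib.get_close_matches(directory_label.lower(),
                                        list(lowered), n=n, cutoff=cutoff)
    return [lowered[m] for m in matches]

global_definitions = ["Customer Satisfaction", "Net Promoter Score",
                      "Order Placement"]
recommend_definitions("customer satisfaction score", global_definitions)
```

Surfacing an existing global definition instead of creating a near-duplicate is what preserves consistency across entities.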
As shown in
The survey attribute definition system 106 can also generate the graphical user interface 1020 including a new attribute definition button 1024. The survey attribute definition system 106 can provide a graphical user interface for generating a new attribute definition in response to receiving selection of the new attribute definition button 1024. The new attribute definition button 1024 can operate similarly or the same as the new attribute definition button 1006.
Additionally, the survey attribute definition system 106 can generate the graphical user interface 1030 including an attribute definition field 1034. The survey attribute definition system 106 can utilize user input received (e.g., via an administrator device) at the attribute definition field 1034 as a new attribute definition label corresponding to the new attribute definition. In some embodiments, the survey attribute definition system 106 can utilize text typed into the attribute definition field 1034 as a label for the attribute definition.
As also shown in
Turning now to
As just mentioned, the survey attribute definition system 106 can include a graphical user interface manager 1102. The graphical user interface manager 1102 can provide and/or present various graphical user interfaces for creating, editing, managing, and taking digital surveys. For example, the graphical user interface manager 1102 can provide selectable elements corresponding to attribute definitions and/or selectable elements corresponding to attribute tags. Further, the graphical user interface manager 1102 can provide these selectable elements based on a variety of organizations of attribute definitions.
Additionally, as shown in
The survey attribute definition system 106 can also include an attribute definition application engine 1106. The attribute definition application engine 1106 can determine attribute definitions for survey data, including digital surveys, digital survey questions, and digital survey responses. Additionally, the attribute definition application engine 1106 can apply attribute definitions to survey data, including digital surveys, digital survey questions, and digital survey responses. In one or more embodiments, the attribute definition application engine 1106 applies the attribute definitions by modifying the metadata associated with the survey data. Additionally, the attribute definition application engine 1106 applies attribute definitions to digital survey responses based on attribute definitions associated with a corresponding digital survey and/or digital survey question.
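The metadata-based application and question-to-response propagation described above can be sketched as follows. The nested dictionary layout for surveys, questions, and responses is an assumption for illustration:

```python
def apply_attribute_definition(definition_label, survey):
    """Apply an attribute definition by writing it into survey metadata,
    then propagate it to each question's responses (illustrative layout)."""
    meta = survey.setdefault("metadata", {}).setdefault(
        "attribute_definitions", [])
    if definition_label not in meta:
        meta.append(definition_label)
    for question in survey.get("questions", []):
        for response in question.get("responses", []):
            tags = response.setdefault("metadata", {}).setdefault(
                "attribute_definitions", [])
            # A response inherits the definition from its survey/question
            if definition_label not in tags:
                tags.append(definition_label)
    return survey

survey = {"questions": [{"responses": [{"value": 9}]}]}
apply_attribute_definition("Customer Satisfaction", survey)
```

Because the definition lands in metadata rather than in the response content itself, later responses to the same question can inherit it without administrator-side tagging.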
Further, the survey attribute definition system 106 can include a digital survey analysis engine 1108. The digital survey analysis engine 1108 can generate digital survey analyses utilizing attribute definitions associated with digital surveys, digital survey questions, and/or digital survey responses. In some embodiments, the digital survey analysis engine 1108 generates survey dashboards, survey reports, survey response search results, and other survey analyses based on attribute definitions. The digital survey analysis engine 1108 can utilize and interpret a variety of survey data from different sources, including from different entities.
The survey attribute definition system 106 can also include a data storage 1110. The data storage 1110 stores and accesses files, indicators, and other data for the survey attribute definition system 106 and/or for the digital survey system 104. For example, the data storage 1110 can communicate with any of the components of the computing device 1100 in order to store a variety of data types for the survey attribute definition system 106. Further, as shown in
Each of the components 1100-1122 of the survey attribute definition system 106 can include software, hardware, or both. For example, the components 1100-1122 can include one or more instructions stored on a computer-readable storage medium and executable by processors of one or more computing devices, such as a client device or server device. When executed by the one or more processors, the computer-executable instructions of the survey attribute definition system 106 can cause the computing device(s) to perform the methods described herein. Alternatively, the components 1100-1122 can include hardware, such as a special-purpose processing device to perform a certain function or group of functions. Alternatively, the components 1100-1122 of the survey attribute definition system 106 can include a combination of computer-executable instructions and hardware.
Furthermore, the components 1100-1122 of the survey attribute definition system 106 may, for example, be implemented as one or more operating systems, as one or more stand-alone applications, as one or more modules of an application, as one or more plug-ins, as one or more library functions or functions that may be called by other applications, and/or as a cloud-computing model. Thus, the components 1100-1122 may be implemented as a stand-alone application, such as a desktop or mobile application. Furthermore, the components 1100-1122 may be implemented as one or more web-based applications hosted on a remote server. The components 1100-1122 may also be implemented in a suite of mobile device applications or “apps.”
As mentioned,
As shown in
Additionally, the series of acts 1200 includes an act 1204 for applying an attribute definition of an organization of attribute definitions to a digital survey. In particular, the act 1204 can include applying, based on a user interaction with a selectable element, an attribute definition of an organization of attribute definitions to a digital survey. Specifically, the act 1204 can include generating an additional digital survey comprising a modified digital survey question from the digital survey question of the digital survey and, in response to identifying a correspondence between the digital survey question of the digital survey and the modified digital survey question of the additional digital survey: applying the attribute definition to the modified digital survey question, and applying the attribute definition to an additional digital survey response associated with the modified digital survey question. Additionally, the act 1204 can include wherein the digital survey comprises a survey template question corresponding to the attribute definition, and further comprising, in response to identifying selection of the survey template question, applying the attribute definition to the digital survey, and in response to determining that the digital survey response corresponds to the survey template question, applying the attribute definition to the survey response.
Further, the series of acts 1200 includes an act 1206 for identifying a digital survey response corresponding to the digital survey. In particular, the act 1206 can include identifying a digital survey response corresponding to the digital survey or to a digital survey question corresponding to the applied attribute definition.
Also, the series of acts 1200 includes an act 1208 for applying the attribute definition to the digital survey response of the digital survey. In particular, the act 1208 can include applying the attribute definition to the digital survey response based on the digital survey response corresponding to the digital survey. Specifically, the act 1208 can include receiving user selection of a digital survey question of the digital survey, applying, based on the user selection of the digital survey question and the user interaction with the selectable element, the attribute definition to the digital survey question of the digital survey, and applying the attribute definition to the digital survey response comprises applying the attribute definition based on determining that the digital survey response corresponds to survey input for the digital survey question associated with the attribute definition.
The series of acts 1200 also includes an act 1210 for generating a digital survey analysis based on the digital survey response and the attribute definition. In particular, the act 1210 can include generating a digital survey analysis based on the digital survey response and the attribute definition utilizing an analytical model. Specifically, the act 1210 can include wherein the attribute definition comprises a net promoter score classification, and further comprising applying the net promoter score classification to the digital survey response.
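The net promoter score classification mentioned above follows the standard NPS bands (0-6 detractor, 7-8 passive, 9-10 promoter), with the score computed as the percentage of promoters minus the percentage of detractors. A minimal sketch, independent of the disclosed analytical model:

```python
def nps_classification(score):
    """Classify a 0-10 rating per the standard net promoter score bands."""
    if not 0 <= score <= 10:
        raise ValueError("NPS ratings are on a 0-10 scale")
    if score <= 6:
        return "detractor"
    if score <= 8:
        return "passive"
    return "promoter"

def net_promoter_score(scores):
    """NPS = %promoters - %detractors, on a -100..100 scale."""
    labels = [nps_classification(s) for s in scores]
    promoters = labels.count("promoter") / len(labels)
    detractors = labels.count("detractor") / len(labels)
    return round(100 * (promoters - detractors))

# Two promoters (10, 9) and two detractors (6, 3) out of five -> 0
net_promoter_score([10, 9, 8, 6, 3])
```

Applying the classification to each digital survey response yields the per-response labels that the aggregate score is built from.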
Additionally, in some embodiments, the series of acts 1200 includes identifying a survey response database comprising a second digital survey response not associated with the digital survey, applying, based on an attribute of the second digital survey response, the attribute definition to the second digital survey response, and based on determining that the attribute definition applies to the second survey response, generating the digital survey analysis further based on the second digital survey response.
Embodiments of the present disclosure may comprise or utilize a special purpose or general-purpose computer including computer hardware, such as, for example, one or more processors and system memory, as discussed in greater detail below. Embodiments within the scope of the present disclosure also include physical and other computer-readable media for carrying or storing computer-executable instructions and/or data structures. In particular, one or more of the processes described herein may be implemented at least in part as instructions embodied in a non-transitory computer-readable medium and executable by one or more computing devices (e.g., any of the media content access devices described herein). In general, a processor (e.g., a microprocessor) receives instructions, from a non-transitory computer-readable medium, (e.g., memory), and executes those instructions, thereby performing one or more processes, including one or more of the processes described herein.
Computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer system. Computer-readable media that store computer-executable instructions are non-transitory computer-readable storage media (devices). Computer-readable media that carry computer-executable instructions are transmission media. Thus, by way of example, and not limitation, embodiments of the disclosure can comprise at least two distinctly different kinds of computer-readable media: non-transitory computer-readable storage media (devices) and transmission media.
Non-transitory computer-readable storage media (devices) includes RAM, ROM, EEPROM, CD-ROM, solid state drives (“SSDs”) (e.g., based on RAM), Flash memory, phase-change memory (“PCM”), other types of memory, other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer.
A “network” is defined as one or more data links that enable the transport of electronic data between computer systems and/or modules and/or other electronic devices. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired or wireless) to a computer, the computer properly views the connection as a transmission medium. Transmissions media can include a network and/or data links which can be used to carry desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Combinations of the above should also be included within the scope of computer-readable media.
Further, upon reaching various computer system components, program code means in the form of computer-executable instructions or data structures can be transferred automatically from transmission media to non-transitory computer-readable storage media (devices) (or vice versa). For example, computer-executable instructions or data structures received over a network or data link can be buffered in RAM within a network interface module (e.g., a “NIC”), and then eventually transferred to computer system RAM and/or to less volatile computer storage media (devices) at a computer system. Thus, it should be understood that non-transitory computer-readable storage media (devices) can be included in computer system components that also (or even primarily) utilize transmission media.
Computer-executable instructions comprise, for example, instructions and data which, when executed by a processor, cause a general-purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions. In some embodiments, computer-executable instructions are executed by a general-purpose computer to turn the general-purpose computer into a special purpose computer implementing elements of the disclosure. The computer-executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code. Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the described features or acts described above. Rather, the described features and acts are disclosed as example forms of implementing the claims.
Those skilled in the art will appreciate that the disclosure may be practiced in network computing environments with many types of computer system configurations, including, personal computers, desktop computers, laptop computers, message processors, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, tablets, pagers, routers, switches, and the like. The disclosure may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by a combination of hardwired and wireless data links) through a network, both perform tasks. In a distributed system environment, program modules may be located in both local and remote memory storage devices.
Embodiments of the present disclosure can also be implemented in cloud computing environments. As used herein, the term “cloud computing” refers to a model for enabling on-demand network access to a shared pool of configurable computing resources. For example, cloud computing can be employed in the marketplace to offer ubiquitous and convenient on-demand access to the shared pool of configurable computing resources. The shared pool of configurable computing resources can be rapidly provisioned via virtualization and released with low management effort or service provider interaction, and then scaled accordingly.
A cloud-computing model can be composed of various characteristics such as, for example, on-demand self-service, broad network access, resource pooling, rapid elasticity, measured service, and so forth. A cloud-computing model can also expose various service models, such as, for example, Software as a Service (“SaaS”), Platform as a Service (“PaaS”), and Infrastructure as a Service (“IaaS”). A cloud-computing model can also be deployed using different deployment models such as private cloud, community cloud, public cloud, hybrid cloud, and so forth. In addition, as used herein, the term “cloud-computing environment” refers to an environment in which cloud computing is employed.
As shown in
In particular embodiments, the processor(s) 1302 includes hardware for executing instructions, such as those making up a computer program. As an example, and not by way of limitation, to execute instructions, the processor(s) 1302 may retrieve (or fetch) the instructions from an internal register, an internal cache, memory 1304, or a storage device 1306 and decode and execute them.
The computing device 1300 includes memory 1304, which is coupled to the processor(s) 1302. The memory 1304 may be used for storing data, metadata, and programs for execution by the processor(s). The memory 1304 may include one or more of volatile and non-volatile memories, such as Random-Access Memory (“RAM”), Read-Only Memory (“ROM”), a solid-state disk (“SSD”), Flash, Phase Change Memory (“PCM”), or other types of data storage. The memory 1304 may be internal or distributed memory.
The computing device 1300 includes a storage device 1306 that includes storage for storing data or instructions. As an example, and not by way of limitation, the storage device 1306 can include a non-transitory storage medium described above. The storage device 1306 may include a hard disk drive (HDD), flash memory, a Universal Serial Bus (USB) drive, or a combination of these or other storage devices.
As shown, the computing device 1300 includes one or more I/O interfaces 1308, which are provided to allow a user to provide input to (such as user strokes), receive output from, and otherwise transfer data to and from the computing device 1300. These I/O interfaces 1308 may include a mouse, keypad or a keyboard, a touch screen, camera, optical scanner, network interface, modem, other known I/O devices or a combination of such I/O interfaces 1308. The touch screen may be activated with a stylus or a finger.
The I/O interfaces 1308 may include one or more devices for presenting output to a user, including, but not limited to, a graphics engine, a display (e.g., a display screen), one or more output drivers (e.g., display drivers), one or more audio speakers, and one or more audio drivers. In certain embodiments, I/O interfaces 1308 are configured to provide graphical data to a display for presentation to a user. The graphical data may be representative of one or more graphical user interfaces and/or any other graphical content as may serve a particular implementation.
The computing device 1300 can further include a communication interface 1310. The communication interface 1310 can include hardware, software, or both. The communication interface 1310 provides one or more interfaces for communication (such as, for example, packet-based communication) between the computing device and one or more other computing devices or one or more networks. As an example, and not by way of limitation, communication interface 1310 may include a network interface controller (NIC) or network adapter for communicating with an Ethernet or other wire-based network or a wireless NIC (WNIC) or wireless adapter for communicating with a wireless network, such as a WI-FI network.
Additionally, or alternatively, the communication interface 1310 may facilitate communications with an ad hoc network, a personal area network (“PAN”), a local area network (“LAN”), a wide area network (“WAN”), a metropolitan area network (“MAN”), or one or more portions of the Internet or a combination of two or more of these. One or more portions of one or more of these networks may be wired or wireless. As an example, the communication interface 1310 may facilitate communications with a wireless PAN (“WPAN”) (such as, for example, a BLUETOOTH WPAN), a WI-FI network, a WI-MAX network, a cellular telephone network (such as, for example, a Global System for Mobile Communications (“GSM”) network), or other suitable wireless network or a combination thereof.
Additionally, the communication interface 1310 may facilitate communications using various communication protocols. Examples of communication protocols that may be used include, but are not limited to, data transmission media, communications devices, Transmission Control Protocol (“TCP”), Internet Protocol (“IP”), File Transfer Protocol (“FTP”), Telnet, Hypertext Transfer Protocol (“HTTP”), Hypertext Transfer Protocol Secure (“HTTPS”), Session Initiation Protocol (“SIP”), Simple Object Access Protocol (“SOAP”), Extensible Markup Language (“XML”) and variations thereof, Simple Mail Transfer Protocol (“SMTP”), Real-Time Transport Protocol (“RTP”), User Datagram Protocol (“UDP”), Global System for Mobile Communications (“GSM”) technologies, Code Division Multiple Access (“CDMA”) technologies, Time Division Multiple Access (“TDMA”) technologies, Short Message Service (“SMS”), Multimedia Message Service (“MMS”), radio frequency (“RF”) signaling technologies, Long Term Evolution (“LTE”) technologies, wireless communication technologies, in-band and out-of-band signaling technologies, and other suitable communications networks and technologies.
The communication infrastructure 1312 may include hardware, software, or both that couples components of the computing device 1300 to each other. As an example and not by way of limitation, the communication infrastructure 1312 may include an Accelerated Graphics Port (“AGP”) or other graphics bus, an Enhanced Industry Standard Architecture (“EISA”) bus, a front-side bus (“FSB”), a HYPERTRANSPORT (“HT”) interconnect, an Industry Standard Architecture (“ISA”) bus, an INFINIBAND interconnect, a low-pin-count (“LPC”) bus, a memory bus, a Micro Channel Architecture (“MCA”) bus, a Peripheral Component Interconnect (“PCI”) bus, a PCI-Express (“PCIe”) bus, a serial advanced technology attachment (“SATA”) bus, a Video Electronics Standards Association local (“VLB”) bus, or another suitable bus or a combination thereof.
This disclosure contemplates any suitable network 1406. As an example and not by way of limitation, one or more portions of network 1406 may include an ad hoc network, an intranet, an extranet, a virtual private network (“VPN”), a local area network (“LAN”), a wireless LAN (“WLAN”), a wide area network (“WAN”), a wireless WAN (“WWAN”), a metropolitan area network (“MAN”), a portion of the Internet, a portion of the Public Switched Telephone Network (“PSTN”), a cellular telephone network, or a combination of two or more of these. Network 1406 may include one or more networks 1406.
Links may connect client system 1408, and digital content survey system 1404 to network 1406 or to each other. This disclosure contemplates any suitable links. In particular embodiments, one or more links include one or more wireline (such as for example Digital Subscriber Line (“DSL”) or Data Over Cable Service Interface Specification (“DOCSIS”)), wireless (such as for example Wi-Fi or Worldwide Interoperability for Microwave Access (“WiMAX”)), or optical (such as for example Synchronous Optical Network (SONET) or Synchronous Digital Hierarchy (“SDH”)) links. In particular embodiments, one or more links each include an ad hoc network, an intranet, an extranet, a VPN, a LAN, a WLAN, a WAN, a WWAN, a MAN, a portion of the Internet, a portion of the PSTN, a cellular technology-based network, a satellite communications technology-based network, another link, or a combination of two or more such links. Links need not necessarily be the same throughout network environment 1400. One or more first links may differ in one or more respects from one or more second links.
In particular embodiments, client system 1408 may be an electronic device including hardware, software, or embedded logic components or a combination of two or more such components and capable of carrying out the appropriate functionalities implemented or supported by client system 1408. As an example and not by way of limitation, a client system 1408 may include any of the computing devices discussed above in relation to
In particular embodiments, client system 1408 may include a web browser, such as MICROSOFT INTERNET EXPLORER, GOOGLE CHROME, or MOZILLA FIREFOX, and may have one or more add-ons, plug-ins, or other extensions, such as TOOLBAR or YAHOO TOOLBAR. A user at client system 1408 may enter a Uniform Resource Locator (“URL”) or other address directing the web browser to a particular server (such as server, or a server associated with a third-party system), and the web browser may generate a Hyper Text Transfer Protocol (“HTTP”) request and communicate the HTTP request to server. The server may accept the HTTP request and communicate to client system 1408 one or more Hyper Text Markup Language (“HTML”) files responsive to the HTTP request. Client system 1408 may render a webpage based on the HTML files from the server for presentation to the user. This disclosure contemplates any suitable webpage files. As an example and not by way of limitation, webpages may render from HTML files, Extensible Hyper Text Markup Language (“XHTML”) files, or Extensible Markup Language (“XML”) files, according to particular needs. Such pages may also execute scripts such as, for example and without limitation, those written in JAVASCRIPT, JAVA, MICROSOFT SILVERLIGHT, combinations of markup language and scripts such as AJAX (Asynchronous JAVASCRIPT and XML), and the like. Herein, reference to a webpage encompasses one or more corresponding webpage files (which a browser may use to render the webpage) and vice versa, where appropriate.
In particular embodiments, digital content survey system 1404 may include a variety of servers, sub-systems, programs, modules, logs, and data stores. In particular embodiments, digital content survey system 1404 may include one or more of the following: a web server, action logger, API-request server, relevance-and-ranking engine, content-object classifier, notification controller, action log, third-party-content-object-exposure log, inference module, authorization/privacy server, search module, advertisement-targeting module, user-interface module, user-profile store, connection store, third-party content store, or location store. Digital content survey system 1404 may also include suitable components such as network interfaces, security mechanisms, load balancers, failover servers, management-and-network-operations consoles, other suitable components, or any suitable combination thereof.
In particular embodiments, digital content survey system 1404 may include one or more user-profile stores for storing user profiles. A user profile may include, for example, biographic information, demographic information, behavioral information, social information, or other types of descriptive information, such as work experience, educational history, hobbies or preferences, interests, affinities, or location. Interest information may include interests related to one or more categories. Categories may be general or specific. Additionally, a user profile may include financial and billing information of users (e.g., respondent device 118, customers).
The foregoing specification is described with reference to specific exemplary embodiments thereof. Various embodiments and aspects of the disclosure are described with reference to details discussed herein, and the accompanying drawings illustrate the various embodiments. The description above and drawings are illustrative and are not to be construed as limiting. Numerous specific details are described to provide a thorough understanding of various embodiments.
The additional or alternative embodiments may be embodied in other specific forms without departing from their spirit or essential characteristics. The described embodiments are to be considered in all respects only as illustrative and not restrictive. The scope of the invention is, therefore, indicated by the appended claims rather than by the foregoing description. All changes that come within the meaning and range of equivalency of the claims are to be embraced within their scope.
Publication: US 20220020039 A1, Jan. 2022.