This patent arises from Indian Provisional Patent Application Serial No. 202011033521, which was filed on Aug. 5, 2020. Indian Provisional Patent Application No. 202011033521 is hereby incorporated herein by reference in its entirety. Priority to Indian Provisional Patent Application No. 202011033521 is hereby claimed.
This disclosure relates generally to computer systems and, more particularly, to computer-based personalized data classification and execution.
Manufacturers of Consumer-Packaged Goods (CPG) often hire data collectors to study display characteristics and/or prices of their products in retail stores in a particular geographic location. In some cases, the data collectors are hired auditors, store employees, or independent contractors that accept or reject work orders sent through manual processes by the CPG manufacturers or a consumer research entity. The work orders may involve instructions or tasks to research pricing, interview customers and employees, and/or collect images.
The figures are not to scale. In general, the same reference numbers will be used throughout the drawing(s) and accompanying written description to refer to the same or like parts.
Unless specifically stated otherwise, descriptors such as “first,” “second,” “third,” etc. are used herein without imputing or otherwise indicating any meaning of priority, physical order, arrangement in a list, and/or ordering in any way, but are merely used as labels and/or arbitrary names to distinguish elements for ease of understanding the disclosed examples. In some examples, the descriptor “first” may be used to refer to an element in the detailed description, while the same element may be referred to in a claim with a different descriptor such as “second” or “third.” In such instances, it should be understood that such descriptors are used merely for identifying those elements distinctly that might, for example, otherwise share a same name.
Retailers, manufacturers, and/or consumer research entities collect data about products and/or services such as product placement in retail stores, advertisement placement, pricing, inventory, retail establishment layout, shopper traffic, vehicle traffic, etc. To request collection of such data, entities can generate task requests and hire resources (e.g., auditors) to serve as data collectors to collect such data in accordance with data collection descriptions in the task requests. Example task requests can request data collection via one or more of capturing photographs, logging data (e.g., in spreadsheets, tables, and/or other data structures), writing descriptions, answering questionnaires, etc. corresponding to product placement, advertisement placement, pricing, inventory, retail establishment layout, shopper traffic, vehicle traffic, etc. Such different types of data collection are becoming increasingly technical and can require different skills and/or data collection equipment (e.g., technologies capable of collecting and processing quantities of data beyond what is possible through human effort alone), such as a drone.
Examples disclosed herein include systems, methods, and apparatus to classify data collectors, interact with data collectors, learn data collector interests and skills based on regular interaction with the data collectors, provide training and assistance to data collectors, and/or assign tasks to data collectors based on the interests and/or skills of the data collector. As used herein, a data collector is a human that is hired, contracted, employed, and/or otherwise provides services to accept work orders for performing one or more tasks involving collecting data for use in the field of consumer research. Different skills, experiences, and interests may make some data collectors better suited for some types of tasks than others. For example, a task involving research of packaging or displays may require a higher level of photography skill than a task involving pricing research. Additionally or alternatively, a CPG client may implement requirements for hiring data collectors. For example, a CPG client may require a data collector to have a performance rating above a certain threshold for the CPG client to consider the data collector for a task. While the data collectors disclosed herein include humans, example systems, methods, apparatus, and articles of manufacture disclosed herein provide technological solutions to improve data collector analysis, management, and allocation.
Prior techniques for processing work orders include manually recruiting data collectors, manually training data collectors, manually gathering information from data collectors, and manually assigning tasks to data collectors. Such prior techniques typically send work orders to any data collectors known in a particular location, regardless of skills, interests, or prior performance. Such manual techniques include discretionary choices by, for example, management personnel. These discretionary choices are based on “gut feel” or anecdotal experiences of the management personnel and, as such, result in inconsistencies in collected data, inefficient training, and inefficient allocation of data collectors. Furthermore, in the event selected data collectors lack a qualified skill set for a work order, resources and money are wasted.
Examples disclosed herein provide substantially automated classification, training, assistance, and task assignment to data collectors by processing input data received from a digital personalized user agent associated with the data collector, assigning tasks to the data collector based on the processed input data, and providing training and/or assistance to the data collector. Examples disclosed herein eliminate the discretionary choices by humans and, thus, improve data collection efficiency and reduce errors in collected data. As a result of reducing data error, examples disclosed herein reduce computational efforts to correct erroneous data and reduce the bandwidth resources used to transmit and/or receive erroneous data, ultimately reducing computational waste associated with data collection.
Example input data includes data collector characteristics such as skills, skill levels, performance ratings, location, device information, and/or interests in performing particular tasks. In examples disclosed herein, the data collector characteristics are used to classify data collectors using machine learning. In some examples, a data collector is associated with a particular class based on data collector characteristics. For example, a data collector having a high photography skill level may be included in a class associated with a high photography skill level. In some examples disclosed herein, the data collector is selected from the class for a task request based on a matching characteristic and/or requirement of the task request. For example, if a task request requires a high photography skill level, then a data collector may be chosen from the class associated with a high photography skill level.
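The class-based selection described above may be illustrated with the following sketch. The characteristic names, the skill-level threshold, and the rule-based matching are illustrative assumptions made for this example only; the disclosed examples classify data collectors using machine learning rather than fixed rules.

```python
# Illustrative sketch of class-based data collector selection. The
# characteristic names and threshold below are hypothetical; the
# disclosed examples use a machine learning model for classification.

def assign_class(collector):
    """Place a data collector in a class based on a skill-level characteristic."""
    if collector.get("photography_skill", 0) >= 8:
        return "high_photography"
    return "general"

def select_for_task(collectors, required_class):
    """Return the data collectors whose class matches the task requirement."""
    return [c for c in collectors if assign_class(c) == required_class]

collectors = [
    {"id": "dc-1", "photography_skill": 9},
    {"id": "dc-2", "photography_skill": 3},
]
matches = select_for_task(collectors, "high_photography")
# matches contains only dc-1
```

In this sketch, a task request that requires a high photography skill level retrieves only data collectors assigned to the corresponding class, consistent with the selection described above.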
As data collector characteristics change over time, the personalized user agent associated with a data collector may use machine learning to dynamically process input data provided by the data collector, learn data collector characteristics based on the processed input data from the data collector, provide training content and guidance to the data collector, predict the behavior of a data collector based on the processed input data from the data collector, and/or accept or reject tasks based on the learned data collector characteristics. The personalized user agent may also learn and associate scores with data collectors based on skills, interests, and/or a performance rating in executing a work order with specific characteristics. The personalized user agent may update characteristics of the data collector (e.g., skills, skill level, or interests) based on completion of tasks and/or completion of training modules.
Artificial intelligence (AI), including machine learning (ML), deep learning (DL), and/or other artificial machine-driven logic, enables machines (e.g., computers, logic circuits, etc.) to use a model to process input data to generate an output based on patterns and/or associations previously learned by the model via a training process. For instance, the model may be trained with data to recognize patterns and/or associations and follow such patterns and/or associations when processing input data such that other input(s) result in output(s) consistent with the recognized patterns and/or associations. Additionally, AI techniques and/or technologies employed herein recognize patterns that cannot be considered by manual human iterative techniques.
Many different types of machine learning models and/or machine learning architectures exist. In examples disclosed herein, a classification model is used. Using a classification model enables a classification agent to classify data collectors based on personal attributes such as skill, performance rating, interests, and location and use these classifications to assign the data collectors to a task they are best suited for. In general, supervised learning is a machine learning model/architecture that is suitable to use in the examples disclosed herein. However, other types of machine learning models could additionally or alternatively be used, such as unsupervised learning, reinforcement learning, etc.
In general, implementing a machine learning/artificial intelligence (ML/AI) system involves two phases: a learning/training phase and an inference phase. In the learning/training phase, a training algorithm is used to train a model to operate in accordance with patterns and/or associations based on, for example, training data. In general, the model includes internal parameters (e.g., a series of nodes and connections within the model) that guide how input data is transformed into output data. Additionally, hyperparameters are used as part of the training process to control how the learning is performed (e.g., a learning rate, a number of layers to be used in the machine learning model, etc.). Hyperparameters are defined to be training parameters that are determined prior to initiating the training process.
Different types of training may be performed based on the type of ML/AI model and/or the expected output. For example, supervised training uses inputs and corresponding expected (e.g., labeled) outputs to select parameters (e.g., by iterating over combinations of select parameters) for the ML/AI model that reduce model error. As used herein, labelling refers to an expected output of the machine learning model (e.g., a classification, an expected output value, etc.). Alternatively, unsupervised training (e.g., used in deep learning, a subset of machine learning, etc.) involves inferring patterns from inputs to select parameters for the ML/AI model (e.g., without the benefit of expected (e.g., labeled) outputs).
In examples disclosed herein, ML/AI models are trained using a nearest-neighbor algorithm. However, any other training algorithm may additionally or alternatively be used. In examples disclosed herein, training is performed at on-premises servers using hyperparameters that control how the learning is performed (e.g., a learning rate, a number of layers to be used in the machine learning model, etc.).
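A minimal sketch of the nearest-neighbor approach mentioned above follows. The numeric encoding of data collector characteristics as feature vectors and the class labels are assumptions made for illustration; the disclosed examples do not fix a particular feature encoding.

```python
import math

# Minimal nearest-neighbor classification sketch. Each training example
# is a (feature_vector, class_label) pair; the features are assumed, for
# illustration, to encode numeric characteristics such as photography
# skill level and performance rating.

def nearest_neighbor(train, query):
    """Return the label of the training example closest to the query."""
    best_label, best_dist = None, math.inf
    for features, label in train:
        dist = math.dist(features, query)  # Euclidean distance
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label

train = [
    ((9.0, 4.5), "photography"),    # high photo skill, high rating
    ((2.0, 4.8), "interpersonal"),  # low photo skill, high rating
]
label = nearest_neighbor(train, (8.0, 4.0))
# label is "photography"
```

A data collector whose characteristics resemble those of a training-group member is thereby associated with that member's class, consistent with the training-group classification described below.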
In examples disclosed herein, training is performed using training data. In some examples, the training data is labeled. In some examples, the training data originates from personalized user agents (e.g., personal agents) associated with data collectors. In some examples, the training data includes data collector characteristics of data collectors in a training group. For example, a training group of data collectors may provide data collector characteristics such as interests, skills, skill levels, geographic location, device information, and other information useful for assigning tasks. In some examples, a training algorithm is used to train a classification model to operate in accordance with patterns and/or associations based on, for example, the data collector characteristics provided by the training group. Once training is complete, the model is deployed for use as an executable construct that processes an input and provides an output based on the network of nodes and connections defined in the model. In some examples disclosed herein, the model is stored in a model data store and may then be executed by the model executor.
Once trained, the deployed model may be operated in an inference phase to process data. In the inference phase, data to be analyzed (e.g., live data) is input to the model, and the model executes to create an output. This inference phase can be thought of as the ML/AI “thinking” to generate the output based on what it learned from the training (e.g., by executing the model to apply the learned patterns and/or associations to the live data). In some examples, input data undergoes pre-processing (e.g., parsing) before being used as an input to the machine learning model. Moreover, in some examples, the output data may undergo post-processing after it is generated by the ML/AI model to transform the output into a useful result (e.g., a display of data, an instruction to be executed by a machine, etc.).
In some examples, output of the deployed model may be captured and provided as feedback. An accuracy of the deployed model can be determined by analyzing the feedback. If the feedback indicates that the accuracy of the deployed model is less than a threshold or other criterion, training of an updated model can be triggered using the feedback and an updated training data set, hyperparameters, etc., to generate an updated, deployed model.
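The feedback-driven retraining trigger described above can be sketched as follows. The accuracy threshold value and the representation of feedback as (predicted, actual) pairs are illustrative assumptions, not limitations of the disclosed examples.

```python
# Sketch of a feedback-driven retraining trigger. The 0.9 threshold is
# an arbitrary example value; the disclosed examples may use any
# threshold or other criterion.

def check_retraining(feedback, threshold=0.9):
    """Compute accuracy from (predicted, actual) feedback pairs and
    report whether training of an updated model should be triggered."""
    if not feedback:
        return False
    correct = sum(1 for predicted, actual in feedback if predicted == actual)
    accuracy = correct / len(feedback)
    return accuracy < threshold

feedback = [("photography", "photography"), ("retail", "pricing")]
needs_retraining = check_retraining(feedback)
# accuracy is 0.5, below the example threshold, so retraining is triggered
```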
In some examples, a system for assigning tasks to a user (e.g., a data collector) based on characteristics associated with the user includes: a personalized user agent associated with the user to collect data from the user, receive input from the user, and learn user behavior based on the collected data and user input; a help desk agent to receive user information and requests from the personalized user agent and provide training, guidance, troubleshooting, and/or technical assistance to the user; a classification agent to receive data and information from the personalized user agent, classify the data and information using a machine learning model, and assign tasks to the user via a distribution agent; and a distribution agent to receive one or more user identifiers corresponding to one or more users suited for a particular task and submit a work order to the personalized user agent(s) associated with the user(s) for the user(s) or their personalized user agents to accept or reject.
The example data collector 110 illustrated in
As previously defined, a data collector (e.g., the data collector 110), as used herein, is a human being that is hired, contracted, employed, and/or otherwise provides services to accept work orders for performing one or more tasks involving collecting data for use in the technical field of consumer research. A data collector is associated with a personalized user agent that the data collector uses to accept or reject work orders and receive assignments, updates, training, and/or technical help.
The personalized user agent 120 illustrated in
In some examples, the personalized user agent 120 receives a work order from the distribution system 150, displays the work order to the data collector 110, and prompts the data collector 110 to accept or reject the work order. In some examples, the personalized user agent 120 receives an acceptance or rejection selection from the data collector 110 via a user input interface and transmits the selection to the distribution system 150 and the classification system 140. In some examples, the personalized user agent 120 accepts or rejects the work order automatically (e.g., without user input) based on learned data collector characteristics (e.g., the data collector is not qualified to complete the work order, etc.). In some examples, the personalized user agent 120 receives information from the classification system 140. In some examples, the personalized user agent 120 transmits queries to the help desk system 130 and receives a response to the query from the help desk system 130. In some examples, the personalized user agent 120 receives a request from the data collector 110 and transmits the request to the help desk system 130. For example, the data collector 110 may request guidance with a technical problem such as troubleshooting an application, correctly taking a picture, or any other technical issue that may arise using the personalized user agent 120. In some examples, the personalized user agent 120 receives information such as updates, training, tutorials, troubleshooting information, information for image collection, and/or other technical information from the help desk system 130.
In some examples, the help desk system 130 illustrated in
In some examples, the help desk system 130 receives user information (e.g., skills, interests, location, skill level, performance ratings, and/or device information) or a request from the personalized user agent 120, identifies training content, tutorials, and/or other guidance for the user based on the user information, and provides the training content, tutorials, and/or other guidance to the personalized user agent 120 in response to the user information or request. In some examples, the help desk system 130 identifies an area of improvement (e.g., weaknesses or deficiencies) in the user's skillset based on the user information and provides customized training content to the personalized user agent 120 of the user. For example, in response to determining that a user has limited experience taking photos with a smartphone, the help desk system 130 provides photography training and/or tutorials to the user to assist the user in developing and improving their image collection skills. In some examples, the help desk system 130 receives device information from the personalized user agent 120 and customizes the tutorial to the particular device. For example, in response to determining that the data collector 110 has an iPhone 11, the help desk system 130 may provide training content for taking images on an iPhone to the personalized user agent 120.
In some examples, in response to determining that a user (e.g., the data collector 110) has a poor photography performance rating and/or a low quality rating for a particular skill, the help desk system 130 provides the user with training and/or tutorials to assist the user in improving that skill. For example, in response to determining that the user has a poor photography performance rating, the help desk system 130 may provide the user with image collection training and/or tutorials. In some examples, the help desk system 130 provides training and/or tutorials for a particular skill in response to determining that the user does not have the skill but has an interest in performing tasks requiring the skill. In some examples, the training evolves as the user's skill, experience, and interests evolve. For example, as a user advances a skill level, the training content may become more advanced and/or may change to address a known weakness in a skill level.
In some examples, the help desk system 130 evaluates a user's work product (e.g., photos, written descriptions, data entries, or other work product collected for a task) and identifies areas of improvement based on a determined quality of the work product. For example, the help desk system 130 may analyze a photo taken by the user for a task, calculate a quality score for the image, and determine whether to provide the user with image collection training based on the quality score. In some examples, the help desk system 130 calculates a score for one or more characteristics of a photo taken by the user for a task. For example, the help desk system 130 may calculate a score for positioning, alignment, lighting, blur, overall clarity, or other characteristic of the image (e.g., an image of a product, a display, a price tag, or other object). In some examples, the help desk system 130 compares the characteristic score to a threshold value to determine whether to provide the user with training content. In some examples, the help desk system 130 identifies an area of improvement based on the characteristic score(s). For example, the help desk system 130 may evaluate a photo taken by a user, calculate an alignment score (e.g., determine how well an object is aligned in the image), compare the alignment score to a threshold value, determine the alignment score is less than the threshold value, and provide alignment guidance, training modules, and/or tutorials to the user.
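The per-characteristic scoring and threshold comparison described above can be sketched as follows. The characteristic names, score values, and threshold values are illustrative assumptions; the disclosed examples do not fix particular values or a particular scoring method.

```python
# Sketch of per-characteristic photo scoring against thresholds, used to
# identify areas of improvement. The characteristic names and threshold
# values below are hypothetical example values.

THRESHOLDS = {"alignment": 0.7, "lighting": 0.6, "clarity": 0.8}

def areas_of_improvement(scores, thresholds=THRESHOLDS):
    """Return the characteristics whose scores fall below the threshold,
    i.e., the areas for which training content may be provided."""
    return [name for name, score in scores.items()
            if score < thresholds.get(name, 1.0)]

photo_scores = {"alignment": 0.55, "lighting": 0.9, "clarity": 0.85}
weak_areas = areas_of_improvement(photo_scores)
# weak_areas == ["alignment"], so alignment guidance would be provided
```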
In some examples, the help desk system 130 identifies an area of improvement and provides guidance to the user while the user is performing a task involving the area of improvement. For example, if the help desk system 130 determines the user has a low alignment score, the help desk system may identify photo alignment as an area of improvement and assist the user in taking a photo by enabling photo assist features (e.g., object detection, guide boxes, and/or other photo assist features) in an application and/or on the device camera.
In some examples, the help desk system 130 updates and/or prompts the personalized user agent 120 to update a user skill and/or a user skill level in response to determining the user has completed a training module or tutorial. In some examples, the help desk system 130 provides an indication to the classification system 140 that the user has completed a training module or tutorial. In some examples, the classification system 140 updates a classification of the user based on the indication from the help desk system 130 that the user completed training content, added a skill, and/or increased in skill level. For example, in response to receiving an indication that the user has completed photography training, the classification system 140 may associate the user with a class having photography skills or an improved level of photography skills compared to before completing the training.
In some examples, the help desk system 130 provides updates, troubleshooting, information related to image collection, and/or other technical information to the personalized user agent 120. In some examples, the help desk system 130 provides updates, troubleshooting, image collection information, and/or other technical information to the personalized user agent 120 in response to a request from the personalized user agent 120. In some examples, the help desk system 130 accesses a data collector characteristic such as location information, device information, and/or other information from the personalized user agent 120. For example, the example help desk system 130 may access information about a camera of a device associated with the example personalized user agent 120 (e.g., resolution, pixel size, and/or optical or digital zoom) and/or a software version of the device and/or a particular application (e.g., a data collection application). In some examples, the help desk system 130 assists with a request from the personalized user agent 120 based on the accessed information. For example, the help desk system 130 may provide the user with tutorials and/or guidance for taking photos with the camera in response to determining that the device camera has poor specifications. In some examples, the help desk system 130 prompts the personalized user agent 120 to update one or more applications in response to determining that the personalized user agent 120 has an out-of-date version of the application(s).
In some examples, the help desk system 130 receives information from the classification system 140. In some examples, the help desk system 130 receives classification information from the classification system 140. In some examples, the help desk system 130 associates training content, tutorials, guidance, and/or other content with a class and provides the training, tutorials, guidance, and/or other content to the personalized user agent 120 of a user associated with the class. For example, the help desk system 130 may associate image collection training with a class having limited photography skills and/or a class having devices with poor camera specifications and provide photography training, tutorials, guidance, and/or other content to an example personalized user agent 120 of a user within the class. In some examples, the help desk system 130 notifies the classification system 140 that a user has completed training content, added a skill, and/or increased a skill level, and, in response to the notification, the classification system 140 may update a classification based on the completed training content, added skill, and/or increased skill level. For example, in response to receiving a notification from the help desk system 130 that the user completed photography training, added a photography skill, and/or increased a photography skill level, the classification system 140 may associate the user with a class having photography skills when subsequent tasks are assigned.
The classification system 140 illustrated in
In some examples, the classification system 140 assigns the data collector 110 to a class based on data received from the personalized user agent 120 of the data collector 110, selects the data collector 110 from the class in response to a task request, and transmits identifying information associated with the data collector 110 to the distribution system 150. In some examples, the classification system 140 receives an indication of acceptance or rejection of a work order from the distribution system 150, stores the indication of acceptance or rejection in memory, and/or updates the information associated with the data collector 110 based on the indication of acceptance or rejection. For example, the classification system 140 may update a classification model based on the indication of acceptance or rejection. In some examples, the classification system 140 receives information from the help desk system 130, such as device information and/or specifications of the personalized user agent 120 associated with the data collector 110 or other information associated with the data collector 110.
The example distribution system 150 illustrated in
In some examples, the classification system 140 updates a data collector characteristic of the data collector 110 based on an indication of acceptance or rejection of the work order. For example, if the personalized user agent 120 rejects a work order for a retail task, the classification system 140 may update an interest characteristic of the data collector 110 to reflect that the data collector 110 may not have an interest in performing retail tasks. Accordingly, the classification system 140 may be less likely to choose the data collector 110 for a retail task in the future. In some examples, the classification system 140 updates a class associated with the data collector 110 based on acceptance or rejection of a task. For example, if the classification system 140 receives a rejection from the personalized user agent 120 for a retail task, the classification system 140 may remove the data collector 110 from a class of data collectors having an interest in retail tasks.
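The update of an interest characteristic based on acceptance or rejection, described above, can be sketched as follows. The interest scale, initial value, and step size are illustrative assumptions made for this example.

```python
# Sketch of updating an interest characteristic after acceptance or
# rejection of a work order. The [0, 1] interest scale, 0.5 initial
# value, and 0.1 step size are hypothetical example values.

def update_interest(collector, task_type, accepted, step=0.1):
    """Nudge the collector's interest score for a task type up on
    acceptance and down on rejection, clamped to [0, 1]."""
    interests = collector.setdefault("interests", {})
    current = interests.get(task_type, 0.5)
    delta = step if accepted else -step
    interests[task_type] = min(1.0, max(0.0, current + delta))
    return collector

dc = {"id": "dc-110"}
update_interest(dc, "retail", accepted=False)
# dc["interests"]["retail"] is now 0.4, making the collector less likely
# to be chosen for retail tasks in the future
```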
The personalized user agent 120, the help desk system 130, the classification system 140, and the distribution system 150 may be arranged to communicate with multiple other user devices, help desk systems, classification systems, distribution systems, and/or other systems not described herein.
As shown in the example diagram of
As shown in the example diagram of
As shown in the example diagram of
As shown in the example diagram of
As shown in the example diagram of
In the example illustrated in
In the example illustrated in
In the example illustrated in
In the example illustrated in
In some examples, the personalized user agent 320 includes an example personal learning controller 332 to analyze and understand input from the data collector 110 and predict data collector characteristics based on the input. In some examples, the personal learning controller 332 includes an example personal model trainer 328 and example personal model executor 330. In some examples, the personal model trainer 328 applies an algorithm (e.g., a personal learning algorithm) to first input from the data collector 110 (e.g., training data), and the personal model executor 330 executes a personalized model based on second input from the data collector 110.
In some examples, the personalized user agent 320 illustrated in
In the example illustrated in
In the example illustrated in
In the example illustrated in
In the example illustrated in
The example personalized user agent 420 of
In some examples, the personalized user agent 420 includes a personal learning controller 432 to analyze and understand input from the data collector 110 and predict data collector characteristics based on the input. In some examples, the personal learning controller 432 includes an example personal model trainer 428 and an example personal model executor 430. In some examples, the personal model trainer 428 applies an algorithm (e.g., a personal learning algorithm) to first input from the data collector 110 (e.g., training data) and the personal model executor 430 executes a personalized model based on second input from the data collector 110.
In some examples, the personalized user agent 420 illustrated in
The user devices 520a-c may implement corresponding personalized user agents such as the personalized user agents 120 (
The example system 500 illustrated in
In the example system 500 illustrated in
In the example system 500 illustrated in
In some examples, the user devices 520a-c learn and associate scores with the respective data collectors 510a-c based on skills and interests of the data collectors 510a-c. For example, the data collectors 510a-c may be assigned a score that reflects the interests and skills of the respective data collector 510a, 510b, 510c in executing a work order with a characteristic. For example, the data collector 510a may have a strong interest and skill level in photography, and thus, may be associated with a high score in photography. The data collector 510b may have strong interpersonal skills and enjoy talking to people, and thus, data collector 510b may be associated with a high interpersonal score. The data collector 510c may have excellent performance ratings, and thus, may be associated with a high reliability and/or performance score.
In some examples, a score associated with a respective data collector 510a-c, may be a combination of sub-scores related to interests of the data collector 510a-c, skills of the data collector 510a-c, and/or other information about the data collector 510a-c. The score associated with a respective data collector 510a-c may be used to classify the data collector 510a-c and determine which type of task is best suited for the data collector 510a-c.
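The combination of sub-scores into an overall score described above can be sketched as a weighted average. The sub-score names and weight values are illustrative assumptions; the disclosed examples do not specify a particular combining formula.

```python
# Sketch of combining sub-scores into an overall data collector score.
# The sub-score names and weights below are hypothetical example values.

def composite_score(sub_scores, weights):
    """Weighted average of interest, skill, and performance sub-scores."""
    total_weight = sum(weights.values())
    return sum(sub_scores[k] * w for k, w in weights.items()) / total_weight

sub_scores = {"interest": 0.9, "skill": 0.8, "performance": 0.95}
weights = {"interest": 1.0, "skill": 2.0, "performance": 1.0}
score = composite_score(sub_scores, weights)
# (0.9*1.0 + 0.8*2.0 + 0.95*1.0) / 4.0 = 0.8625
```

A composite score of this kind could then be used to classify the data collector and to determine which type of task is best suited for the data collector, consistent with the description above.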
In the illustrated example of
In the illustrated example of
In the illustrated example of
The classification learning controller 643 illustrated in
In the illustrated example of
The classification agent 540 illustrated in
While an example manner of implementing the classification agent 540 of
Flowcharts representative of example hardware logic, machine readable instructions, hardware implemented state machines, and/or any combination thereof for implementing the personalized user agent 120, the help desk system 130, the classification system 140, and/or the distribution system 150 of
The machine readable instructions described herein may be stored in one or more of a compressed format, an encrypted format, a fragmented format, a compiled format, an executable format, a packaged format, etc. Machine readable instructions as described herein may be stored as data or a data structure (e.g., portions of instructions, code, representations of code, etc.) that may be utilized to create, manufacture, and/or produce machine executable instructions. For example, the machine readable instructions may be fragmented and stored on one or more storage devices and/or computing devices (e.g., servers) located at the same or different locations of a network or collection of networks (e.g., in the cloud, in edge devices, etc.). The machine readable instructions may require one or more of installation, modification, adaptation, updating, combining, supplementing, configuring, decryption, decompression, unpacking, distribution, reassignment, compilation, etc. in order to make them directly readable, interpretable, and/or executable by a computing device and/or other machine. For example, the machine readable instructions may be stored in multiple parts, which are individually compressed, encrypted, and stored on separate computing devices, wherein the parts when decrypted, decompressed, and combined form a set of executable instructions that implement one or more functions that may together form a program such as that described herein.
In another example, the machine readable instructions may be stored in a state in which they may be read by processor circuitry, but require addition of a library (e.g., a dynamic link library (DLL)), a software development kit (SDK), an application programming interface (API), etc. in order to execute the instructions on a particular computing device or other device. In another example, the machine readable instructions may need to be configured (e.g., settings stored, data input, network addresses recorded, etc.) before the machine readable instructions and/or the corresponding program(s) can be executed in whole or in part. Thus, machine readable media, as used herein, may include machine readable instructions and/or program(s) regardless of the particular format or state of the machine readable instructions and/or program(s) when stored or otherwise at rest or in transit.
The machine readable instructions described herein can be represented by any past, present, or future instruction language, scripting language, programming language, etc. For example, the machine readable instructions may be represented using any of the following languages: C, C++, Java, C#, Perl, Python, JavaScript, HyperText Markup Language (HTML), Structured Query Language (SQL), Swift, etc.
As mentioned above, the example processes of
“Including” and “comprising” (and all forms and tenses thereof) are used herein to be open ended terms. Thus, whenever a claim employs any form of “include” or “comprise” (e.g., comprises, includes, comprising, including, having, etc.) as a preamble or within a claim recitation of any kind, it is to be understood that additional elements, terms, etc. may be present without falling outside the scope of the corresponding claim or recitation. As used herein, when the phrase “at least” is used as the transition term in, for example, a preamble of a claim, it is open-ended in the same manner as the terms “comprising” and “including” are open ended. The term “and/or” when used, for example, in a form such as A, B, and/or C refers to any combination or subset of A, B, C such as (1) A alone, (2) B alone, (3) C alone, (4) A with B, (5) A with C, (6) B with C, and (7) A with B and with C. As used herein in the context of describing structures, components, items, objects and/or things, the phrase “at least one of A and B” is intended to refer to implementations including any of (1) at least one A, (2) at least one B, and (3) at least one A and at least one B. Similarly, as used herein in the context of describing structures, components, items, objects and/or things, the phrase “at least one of A or B” is intended to refer to implementations including any of (1) at least one A, (2) at least one B, and (3) at least one A and at least one B. As used herein in the context of describing the performance or execution of processes, instructions, actions, activities and/or steps, the phrase “at least one of A and B” is intended to refer to implementations including any of (1) at least one A, (2) at least one B, and (3) at least one A and at least one B.
Similarly, as used herein in the context of describing the performance or execution of processes, instructions, actions, activities and/or steps, the phrase “at least one of A or B” is intended to refer to implementations including any of (1) at least one A, (2) at least one B, and (3) at least one A and at least one B.
As used herein, singular references (e.g., “a”, “an”, “first”, “second”, etc.) do not exclude a plurality. The term “a” or “an” entity, as used herein, refers to one or more of that entity. The terms “a” (or “an”), “one or more”, and “at least one” can be used interchangeably herein. Furthermore, although individually listed, a plurality of means, elements or method actions may be implemented by, e.g., a single unit or processor. Additionally, although individual features may be included in different examples or claims, these may possibly be combined, and the inclusion in different examples or claims does not imply that a combination of features is not feasible and/or advantageous.
The example program 700 of
Another example flowchart representative of example programs 800 of
At block 802, the classification learning controller 643 (
At block 804, the classification learning controller 643 selects a class based on a requested characteristic of a task request. The classification learning controller 643 may select a class in response to receiving the task request from a distribution agent (e.g., the distribution agent 550 of
At block 806, the selection generator 644 (
At block 808, the data interface 641 sends (e.g., transmits) the selection to a distribution agent (e.g., the distribution agent 550 of
At block 810, the distribution agent 550 illustrated in
At block 814, a user device 520a-c illustrated in
At block 822, the distribution agent 550 generates an assignment based on the task request. The distribution agent 550 sends (e.g., transmits) the assignment to the user device 520a-c (block 824). In some examples, the assignment includes further details and/or instructions relating to the task, such as location, requirements, pay, expectations, and/or criteria associated with the task, and/or any other information related to the task. At block 826, the distribution agent 550 sends (e.g., transmits) the acceptance or rejection of the work order to the classification agent 540.
At block 828, the classification agent 540 (
At block 902, the classification agent 540 classifies data collectors (e.g., the data collectors 510a-c of
At block 1010, the classification agent 540 classifies data collectors (e.g., the data collectors 510a-c of
At block 1110, the user device 520a-c sends (e.g., transmits) information and/or a help request to a help desk agent (e.g., the help desk agent 530 of
At block 1120, the user device 520a-c receives training, tutorials, troubleshooting, guidance, and/or other assistance (e.g., a response to the help request) from the help desk agent 530. For example, the user device 520a may receive a photography tutorial from the help desk agent 530.
At block 1130, the user device 520a-c presents the training, tutorials, troubleshooting, guidance, and/or other assistance to the data collector 510a-c. For example, the user device 520a may present the photography tutorial to the data collector 510a.
At block 1140, the user device 520a-c updates data collector characteristics and/or a data collector score. The user device 520a-c may perform the updates of block 1140 based on completion of the training, tutorials, troubleshooting, or other form(s) of assistance. For example, the user device 520a may update the photography skill level and/or a photography skill level score of the data collector 510a based on completion of the photography tutorial.
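The score update of block 1140 can be sketched as a small Python function. The dictionary layout, the additive boost, and the 1.0 cap are illustrative assumptions, not the disclosed scoring rule:

```python
# Hypothetical sketch of block 1140: after a data collector completes a
# tutorial, the user device raises the corresponding skill level score.
def update_collector_score(characteristics: dict, completed_tutorial: str,
                           boost: float = 0.1) -> dict:
    """Return updated characteristics after a completed tutorial."""
    updated = dict(characteristics)
    scores = dict(updated.get("skill_scores", {}))
    # Cap scores at 1.0 so repeated tutorials saturate rather than grow unbounded.
    scores[completed_tutorial] = min(1.0, scores.get(completed_tutorial, 0.0) + boost)
    updated["skill_scores"] = scores
    return updated

collector = {"id": "510a", "skill_scores": {"photography": 0.5}}
collector = update_collector_score(collector, "photography")
print(collector["skill_scores"]["photography"])  # → 0.6
```

Because the updated characteristics feed back into classification, completing a tutorial can move a data collector into a class eligible for more task requests.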
The processor platform 1200 of the illustrated example includes a processor 1212. The processor 1212 of the illustrated example is hardware. For example, the processor 1212 can be implemented by one or more integrated circuits, logic circuits, microprocessors, GPUs, DSPs, or controllers from any desired family or manufacturer. The hardware processor may be a semiconductor-based (e.g., silicon-based) device. In this example, the processor 1212 implements the example classification algorithms 342 and 442, the example preferential learning (score computation) algorithms 344 and 426, the example relevance ranking and scoring algorithms 444, and the example collaborative algorithms 346 and 446 of
The processor 1212 of the illustrated example includes a local memory 1213 (e.g., a cache). The processor 1212 of the illustrated example is in communication with a main memory including a volatile memory 1214 and a non-volatile memory 1216 via a bus 1218. The volatile memory 1214 may be implemented by Synchronous Dynamic Random Access Memory (SDRAM), Dynamic Random Access Memory (DRAM), RAMBUS® Dynamic Random Access Memory (RDRAM®) and/or any other type of random access memory device. The non-volatile memory 1216 may be implemented by flash memory and/or any other desired type of memory device. Access to the main memory 1214, 1216 is controlled by a memory controller. The example memory 645 of the example classification agent 540 illustrated in
The processor platform 1200 of the illustrated example also includes an interface circuit 1220. The interface circuit 1220 may be implemented by any type of interface standard, such as an Ethernet interface, a universal serial bus (USB), a Bluetooth® interface, a near field communication (NFC) interface, and/or a PCI express interface.
In the illustrated example, one or more input devices 1222 are connected to the interface circuit 1220. The input device(s) 1222 permit(s) a user to enter data and/or commands into the processor 1212. The input device(s) 1222 can be implemented by, for example, an audio sensor, a microphone, a camera (still or video), a keyboard, a button, a mouse, a touchscreen, a track-pad, a trackball, isopoint and/or a voice recognition system.
One or more output devices 1224 are also connected to the interface circuit 1220 of the illustrated example. The output devices 1224 can be implemented, for example, by display devices (e.g., a light emitting diode (LED), an organic light emitting diode (OLED), a liquid crystal display (LCD), a cathode ray tube display (CRT), an in-place switching (IPS) display, a touchscreen, etc.), a tactile output device, a printer and/or speaker. The interface circuit 1220 of the illustrated example, thus, typically includes a graphics driver card, a graphics driver chip and/or a graphics driver processor.
The interface circuit 1220 of the illustrated example also includes a communication device such as a transmitter, a receiver, a transceiver, a modem, a residential gateway, a wireless access point, and/or a network interface to facilitate exchange of data with external machines (e.g., computing devices of any kind) via a network 1226. The communication can be via, for example, an Ethernet connection, a digital subscriber line (DSL) connection, a telephone line connection, a coaxial cable system, a satellite system, a line-of-sight wireless system, a cellular telephone system, etc.
The processor platform 1200 of the illustrated example also includes one or more mass storage devices 1228 for storing software and/or data. Examples of such mass storage devices 1228 include floppy disk drives, hard drive disks, compact disk drives, Blu-ray disk drives, redundant array of independent disks (RAID) systems, and digital versatile disk (DVD) drives.
Example machine executable instructions 1232 represented in
The processor platform 1300 of the illustrated example includes a processor 1312. The processor 1312 of the illustrated example is hardware. For example, the processor 1312 can be implemented by one or more integrated circuits, logic circuits, microprocessors, GPUs, DSPs, or controllers from any desired family or manufacturer. The hardware processor may be a semiconductor-based (e.g., silicon-based) device. In this example, the processor 1312 implements the example personal learning controllers 332 and 432 (
The processor 1312 of the illustrated example includes a local memory 1313 (e.g., a cache). The processor 1312 of the illustrated example is in communication with a main memory including a volatile memory 1314 and a non-volatile memory 1316 via a bus 1318. The volatile memory 1314 may be implemented by Synchronous Dynamic Random Access Memory (SDRAM), Dynamic Random Access Memory (DRAM), RAMBUS® Dynamic Random Access Memory (RDRAM®) and/or any other type of random access memory device. The non-volatile memory 1316 may be implemented by flash memory and/or any other desired type of memory device. Access to the main memory 1314, 1316 is controlled by a memory controller.
The processor platform 1300 of the illustrated example also includes an interface circuit 1320. The interface circuit 1320 may be implemented by any type of interface standard, such as an Ethernet interface, a universal serial bus (USB), a Bluetooth® interface, a near field communication (NFC) interface, and/or a PCI express interface.
In the illustrated example, one or more input devices 1322 are connected to the interface circuit 1320. The input device(s) 1322 permit(s) a user to enter data and/or commands into the processor 1312. The input device(s) 1322 can be implemented by, for example, an audio sensor, a microphone, a camera (still or video), a keyboard, a button, a mouse, a touchscreen, a track-pad, a trackball, isopoint and/or a voice recognition system.
One or more output devices 1324 are also connected to the interface circuit 1320 of the illustrated example. The output devices 1324 can be implemented, for example, by display devices (e.g., a light emitting diode (LED), an organic light emitting diode (OLED), a liquid crystal display (LCD), a cathode ray tube display (CRT), an in-place switching (IPS) display, a touchscreen, etc.), a tactile output device, a printer and/or speaker. The interface circuit 1320 of the illustrated example, thus, typically includes a graphics driver card, a graphics driver chip and/or a graphics driver processor.
The interface circuit 1320 of the illustrated example also includes a communication device such as a transmitter, a receiver, a transceiver, a modem, a residential gateway, a wireless access point, and/or a network interface to facilitate exchange of data with external machines (e.g., computing devices of any kind) via a network 1326. The communication can be via, for example, an Ethernet connection, a digital subscriber line (DSL) connection, a telephone line connection, a coaxial cable system, a satellite system, a line-of-sight wireless system, a cellular telephone system, etc.
The processor platform 1300 of the illustrated example also includes one or more mass storage devices 1328 for storing software and/or data. Examples of such mass storage devices 1328 include floppy disk drives, hard drive disks, compact disk drives, Blu-ray disk drives, redundant array of independent disks (RAID) systems, and digital versatile disk (DVD) drives.
Example machine executable instructions 1332 represented in
A block diagram of an example software distribution platform 1405 to distribute software such as the example computer readable instructions 1232 of
The disclosed methods, apparatus and articles of manufacture improve the efficiency of using a computing device by using artificial intelligence/machine learning to learn characteristics of data collectors and automatically assign tasks to data collectors based on the learned characteristics. The disclosed methods, apparatus and articles of manufacture are accordingly directed to one or more improvement(s) in the functioning of a computer.
In some examples, an example apparatus includes a classification learning controller to associate a data collector with a class by executing a classification model using a first data collector characteristic, the first data collector characteristic corresponding to the data collector, the classification model generated by applying a learning algorithm to classification training data, the classification training data including second data collector characteristics of a training group; a selection generator to select the class based on a requested characteristic of a task request from a distribution agent and select the data collector associated with the class; and a data interface to send the selection to the distribution agent.
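A minimal sketch of the apparatus above, under stated assumptions: the "classification model" here is a trivial majority-label lookup learned from a training group, and all skill names, class labels, and collector identifiers are invented for illustration. The disclosure does not limit the classification model to this form:

```python
from collections import defaultdict

def train_classification_model(training_data):
    """Learn a skill -> class mapping from (characteristics, class) pairs."""
    votes = defaultdict(lambda: defaultdict(int))
    for characteristics, label in training_data:
        votes[characteristics["top_skill"]][label] += 1
    # Majority vote per skill stands in for the learned classification model.
    return {skill: max(labels, key=labels.get) for skill, labels in votes.items()}

def classify(model, collector):
    return model.get(collector["top_skill"], "general")

# Second data collector characteristics of a hypothetical training group.
training_group = [
    ({"top_skill": "photography"}, "photo"),
    ({"top_skill": "photography"}, "photo"),
    ({"top_skill": "surveys"}, "questionnaire"),
]
model = train_classification_model(training_group)

# Associate each data collector with a class via the model.
collectors = [
    {"id": "510a", "top_skill": "photography"},
    {"id": "510b", "top_skill": "surveys"},
]
by_class = defaultdict(list)
for c in collectors:
    by_class[classify(model, c)].append(c["id"])

# Selection: a task request whose requested characteristic maps to the
# "photo" class yields collector 510a for the distribution agent.
print(by_class["photo"])  # → ['510a']
```

The two-step structure mirrors the apparatus: the controller associates collectors with classes ahead of time, so the selection generator only needs a class lookup when a task request arrives.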
In some examples, the first data collector characteristic includes at least one of a skill level of the data collector, a performance rating of the data collector, one or more interests of the data collector, a location of the data collector, or device information of the data collector.
In some examples, the learning algorithm is at least one of a classification algorithm, a preferential learning algorithm, a relevance ranking and scoring algorithm, or a collaborative algorithm.
In some examples, the classification learning controller is to update the classification model based on an acceptance or rejection of the task request.
In some examples, the apparatus includes a personalized user agent, the personalized user agent including a personal learning controller to accept or reject the task request by executing a personal model, the personal model generated by applying a personal learning algorithm to personal training data based on first user input.
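One hypothetical way to sketch the personal model: an acceptance-rate score per task type, learned from the first user input, with a fixed threshold deciding accept or reject. The scoring rule, threshold, and default-reject behavior are illustrative assumptions, not the disclosed model:

```python
def train_personal_model(history):
    """history: list of (task_type, accepted) pairs from first user input."""
    counts = {}
    for task_type, accepted in history:
        yes, total = counts.get(task_type, (0, 0))
        counts[task_type] = (yes + (1 if accepted else 0), total + 1)
    # Personal model: fraction of past tasks of each type the user accepted.
    return {t: yes / total for t, (yes, total) in counts.items()}

def personal_accepts(model, task_type, threshold=0.5):
    # Unknown task types default to rejection pending further user input.
    return model.get(task_type, 0.0) >= threshold

model = train_personal_model([
    ("photograph", True), ("photograph", True),
    ("questionnaire", False),
])
print(personal_accepts(model, "photograph"))     # → True
print(personal_accepts(model, "questionnaire"))  # → False
```

Retraining on second user input (as in the periodic engagement described below the apparatus examples) would simply extend the history and recompute the scores.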
In some examples, the personal learning algorithm associated with the personal learning controller is at least one of a natural language understanding algorithm, a preferential learning algorithm, or a relevance ranking and scoring algorithm.
In some examples, the personalized user agent periodically engages the data collector by prompting the data collector to provide second user input and updates the personal model based on the second user input.
In some examples, the task request includes at least one of a request to capture a photograph, log data, write a description, or answer a questionnaire.
In some examples, a non-transitory computer readable medium includes computer readable instructions that, when executed, cause at least one processor to at least associate a data collector with a class by executing a classification model using a first data collector characteristic, the first data collector characteristic corresponding to the data collector, the classification model generated by applying a learning algorithm to classification training data, the classification training data including second data collector characteristics of a training group; select the class based on a requested characteristic of a task request from a distribution agent; select the data collector associated with the class; and transmit the selection to the distribution agent.
In some examples, the first data collector characteristic includes at least one of a skill level of the data collector, a performance rating of the data collector, one or more interests of the data collector, a location of the data collector, or device information of the data collector.
In some examples, the learning algorithm is at least one of a classification algorithm, a preferential learning algorithm, a relevance ranking and scoring algorithm, or a collaborative algorithm.
In some examples, the computer readable instructions are further to cause the at least one processor to update the classification model based on an acceptance or rejection of the task request.
In some examples, the task request includes at least one of a request to capture a photograph, log data, write a description, or answer a questionnaire.
In some examples, a method includes associating, by executing an instruction with a processor, a data collector with a class by executing a classification model using a first data collector characteristic, the first data collector characteristic corresponding to the data collector, the classification model generated by applying a learning algorithm to classification training data, the classification training data including second data collector characteristics of a training group; in response to receiving a task request from a distribution agent, selecting, by executing an instruction with the processor, the class based on a requested characteristic of the task request; selecting, by executing an instruction with the processor, the data collector associated with the class; and sending, by executing an instruction with the processor, the selection to the distribution agent.
In some examples, the first data collector characteristic includes at least one of a skill level of the data collector, a performance rating of the data collector, one or more interests of the data collector, a location of the data collector, or device information of the data collector.
In some examples, the learning algorithm is at least one of a classification algorithm, a preferential learning algorithm, a relevance ranking and scoring algorithm, or a collaborative algorithm.
In some examples, the method includes updating the classification model based on an acceptance or rejection of the task request.
In some examples, the task request includes at least one of a request to capture a photograph, log data, write a description, or answer a questionnaire.
In some examples, the method includes accepting or rejecting, by a personalized user agent, the task request by executing a personal model, the personal model generated by applying a personal learning algorithm to personal training data based on first user input.
In some examples, the personalized user agent updates the personal model based on second user input.
In some examples, the personal learning algorithm is at least one of a natural language understanding algorithm, a preferential learning algorithm, or a relevance ranking and scoring algorithm.
In some examples, the personalized user agent periodically engages the data collector by prompting the data collector to provide user input.
In some examples, the personalized user agent periodically engages the data collector using a chatbot.
Although certain example methods, apparatus and articles of manufacture have been disclosed herein, the scope of coverage of this patent is not limited thereto. On the contrary, this patent covers all methods, apparatus and articles of manufacture fairly falling within the scope of the claims of this patent.
Number | Date | Country | Kind
---|---|---|---
202011033521 | Aug 2020 | IN | national