Changes in computing technologies have provided individuals with additional options for obtaining and validating technical skills and proficiencies. Rather than attending traditional educational institutions and professional training courses, many individuals may now obtain their technical skills and proficiencies from alternative sources, such as structured or unstructured and asynchronous eLearning programs using distance learning technology, self-study research without any direct supervision, or various alternative technical learning, training, and testing entities. Although such advances in technologies and increasing globalization trends provide many more options for individuals to obtain technical skills and proficiencies, they also present challenges in publishing, verifying, and tracking the sets of technical skills and proficiencies that these individuals have obtained. Many individuals and institutions no longer rely on physical certificates such as diplomas, transcripts, certification statements, and physical licenses to verify the authenticity of an individual's proficiencies or qualifications. Instead, certain institutions may issue digital credentials (or digital badges) to qualifying individuals, and these digital credential earners may use the digital credentials to certify the skills or qualifications that the earner obtained vis-à-vis the institution.
Various techniques are described herein for executing and monitoring physical simulations within a digital credential platform. In various embodiments, techniques for generating digital credentials may include using physical simulation evaluation systems including sensor-based monitoring systems, simulation output systems, and/or simulation environment control systems. Within such systems, particular types of digital credential simulations may be determined and executed within a physical environment, during which a plurality of sensors may be used to monitor the physical actions of a user (or credential receiver) during the physical simulations. The physical action data may be analyzed to determine particular physical activities performed by the user, and the characteristics of those physical activities, such as speed, efficiency, error rate, etc. The physical activities performed may be analyzed and compared to digital credential requirements to determine one or more digital credentials to be generated and issued to the credential receiver, based on the monitored activities. In certain embodiments, the physical simulation may be generated by controlling one or more output systems and/or environmental control systems. Additionally, the usage and configuration of sensors during monitoring may be based on the physical simulation being executed. Such simulations may include, for example, computer terminal-based physical simulations, and/or simulations requiring physical activities within the simulation environment.
Additional techniques are described herein for generating and issuing digital credentials to digital credential receivers, based on actions detected within sensor-monitored environments. In certain embodiments, a digital credential generator may include an operation evaluation system having a plurality of configurable sensors directed to detect user activity within a physical environment. The evaluation system may monitor a physical environment associated with a user, in order to detect various user actions performed by the user within the environment during a predetermined time period. Sets of user operations may be determined based on the physical actions, and the user operations may be compared to digital credential criteria associated with a plurality of different digital credential types. Upon determining that a user (or credential receiver) is eligible to receive one or more digital credentials based on the comparisons, the system may generate digital credentials based on the corresponding digital credential templates, and issue the digital credentials to the associated credential receivers. In some examples, digital credentials based on the analyses from operation evaluation systems may include embedded credentialing time data, credentialing location data, credentialing sensor system data, and the like. Sensors used by an operation evaluation system may include, for example, software-based sensors and/or video or motion detection and analysis sensors. Additional sensors and monitoring techniques used during credentialing determinations may include biometric analysis and/or facial recognition for authentication and/or credentialing.
Further techniques are described herein for tracking and analyzing digital credential usage in sensor-monitored environments. In certain embodiments, a digital credential platform server may be configured to receive data identifying credential receivers, and then to retrieve the digital credentials generated and issued to those credential receivers. The digital credential platform server may then determine sets of physical activities associated with the digital credentials issued to a credential receiver, and may use sensor-based monitoring systems to detect the user actions of the credential receiver and compare those actions to the physical activities associated with the credentials issued to that receiver. Such sensor-based monitoring systems may include computer terminal-based systems and/or larger scale physical monitoring environments. The comparisons between the user actions detected for the credential receiver and the physical activities associated with the receiver's credentials may be used to determine re-credentialing time periods, credential expiration times, etc. Additionally, in some cases, comparisons may include detecting a number of particular action types performed by the credential receiver, error rates, compliance with protocols, etc.
Additional techniques described herein may include generating and issuing digital credentials within a credentialing environment, including storing digital credentials with associated sensor data collected via a sensor-monitored environment. For example, a digital credential generation system may include a sensor-based monitoring or detection system, along with digital credential generation and issuing components. During an evaluation of a credential receiver, and generation/issuance of digital credentials to the receiver, the receiver may be monitored using various sensors. The digital credential generation system may determine the relevant sensor data, discarding unnecessary sensor data in some cases, and may store the relevant sensor data in a digital credential storage repository associated with the particular credential receiver. The associated sensor data may serve as authentication data and/or evidence of the completion of the credential criteria by the receiver. Additionally, in some cases, the stored sensor data may be automatically applied to additional digital credential criteria, such as updated criteria from the same digital credential, or criteria for different types of digital credentials, and the system may determine the receiver's eligibility based on the additional criteria. Thus, such techniques may allow the digital credential generation system to automatically generate and issue updated and/or additional digital credentials to receivers based on analyses of previously collected sensor data records, rather than requiring retesting or reevaluation by the receiver.
The techniques described herein may further include analysis, generation, and issuance of digital credentials based on feedback associated with credential receivers. For example, digital credential generation systems may use sensor detection systems to monitor the user actions of the credential receiver and capture sensor data of the user during the interactive assessments and other credentialing processes. Such captured sensor data may be, in some examples, non-verbal, non-written user response data detected during interactive computer-based assessment sessions, and may be collected using various devices such as biometric sensors, video or audio recording devices, facial expression or gesture capturing and analysis software, etc. The captured sensor data then may be stored as feedback data, along with the responses from the receiver collected during the interactive assessments. Such feedback data may be evaluated and used to authenticate the user and to determine user emotion/response data during certain simulations and assessments. Additionally, the sensor feedback data collected for the user may be used in eligibility determinations for digital credentials in certain embodiments.
In the appended figures, similar components and/or features may have the same reference label. Further, various components of the same type may be distinguished by following the reference label by a dash and a second label that distinguishes among the similar components. If only the first reference label is used in the specification, the description is applicable to any one of the similar components having the same first reference label irrespective of the second reference label.
The ensuing description provides illustrative embodiment(s) only and is not intended to limit the scope, applicability or configuration of the disclosure. Rather, the ensuing description of the illustrative embodiment(s) will provide those skilled in the art with an enabling description for implementing a preferred exemplary embodiment. It is understood that various changes can be made in the function and arrangement of elements without departing from the spirit and scope as set forth in the appended claims.
Various techniques (e.g., systems, methods, computer-program products tangibly embodied in a non-transitory machine-readable storage medium, etc.) are described herein for executing and monitoring physical simulations within a digital credential platform. In various embodiments, techniques for generating digital credentials may include using physical simulation evaluation systems including sensor-based monitoring systems, simulation output systems, and/or simulation environment control systems. Within such systems, particular types of digital credential simulations may be determined and executed within a physical environment, during which a plurality of sensors may be used to monitor the physical actions of a user (or credential receiver) during the physical simulations. The physical action data may be analyzed to determine particular physical activities performed by the user, and the characteristics of those physical activities, such as speed, efficiency, error rate, etc. The physical activities performed may be analyzed and compared to digital credential requirements to determine one or more digital credentials to be generated and issued to the credential receiver, based on the monitored activities. In certain embodiments, the physical simulation may be generated by controlling one or more output systems and/or environmental control systems. Additionally, the usage and configuration of sensors during monitoring may be based on the physical simulation being executed. Such simulations may include, for example, computer terminal-based physical simulations, and/or simulations requiring physical activities within the simulation environment.
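The comparison of monitored activity characteristics (e.g., speed, efficiency, error rate) against digital credential requirements might be sketched as follows. This is a minimal illustration only: the function names, metric names, and thresholds are all hypothetical assumptions, not part of the platform described herein.

```python
import operator

def eligible_credentials(activity_metrics, credential_requirements):
    """Return the credential types whose requirements are all satisfied
    by the monitored activity metrics (e.g., speed, efficiency, error rate)."""
    earned = []
    for cred_type, requirements in credential_requirements.items():
        # A requirement maps a metric name to a (comparator, threshold) pair.
        if all(compare(activity_metrics.get(metric, 0), threshold)
               for metric, (compare, threshold) in requirements.items()):
            earned.append(cred_type)
    return earned

# Illustrative requirements for two hypothetical credential types.
requirements = {
    "welding-level-1": {
        "welds_per_hour": (operator.ge, 10),
        "error_rate": (operator.le, 0.05),
    },
    "welding-level-2": {
        "welds_per_hour": (operator.ge, 25),
        "error_rate": (operator.le, 0.01),
    },
}

metrics = {"welds_per_hour": 14, "error_rate": 0.03}
print(eligible_credentials(metrics, requirements))  # ['welding-level-1']
```

Each requirement here is a simple per-metric threshold; a production evaluation system would likely combine sensor readings over time before producing such summary metrics.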
Additional techniques are described herein for generating and issuing digital credentials to digital credential receivers, based on actions detected within sensor-monitored environments. In certain embodiments, a digital credential generator may include an operation evaluation system having a plurality of configurable sensors directed to detect user activity within a physical environment. The evaluation system may monitor a physical environment associated with a user, in order to detect various user actions performed by the user within the environment during a predetermined time period. Sets of user operations may be determined based on the physical actions, and the user operations may be compared to digital credential criteria associated with a plurality of different digital credential types. Upon determining that a user (or credential receiver) is eligible to receive one or more digital credentials based on the comparisons, the system may generate digital credentials based on the corresponding digital credential templates, and issue the digital credentials to the associated credential receivers. In some examples, digital credentials based on the analyses from operation evaluation systems may include embedded credentialing time data, credentialing location data, credentialing sensor system data, and the like. Sensors used by an operation evaluation system may include, for example, software-based sensors and/or video or motion detection and analysis sensors. Additional sensors and monitoring techniques used during credentialing determinations may include biometric analysis and/or facial recognition for authentication and/or credentialing.
Further techniques are described herein for tracking and analyzing digital credential usage in sensor-monitored environments. In certain embodiments, a digital credential platform server may be configured to receive data identifying credential receivers, and then to retrieve the digital credentials generated and issued to those credential receivers. The digital credential platform server may then determine sets of physical activities associated with the digital credentials issued to a credential receiver, and may use sensor-based monitoring systems to detect the user actions of the credential receiver and compare those actions to the physical activities associated with the credentials issued to that receiver. Such sensor-based monitoring systems may include computer terminal-based systems and/or larger scale physical monitoring environments. The comparisons between the user actions detected for the credential receiver and the physical activities associated with the receiver's credentials may be used to determine re-credentialing time periods, credential expiration times, etc. Additionally, in some cases, comparisons may include detecting a number of particular action types performed by the credential receiver, error rates, compliance with protocols, etc.
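One way such comparisons might feed into a re-credentialing time period is sketched below, assuming a simple rule set: the window is shortened when the receiver rarely performs the credential's associated activities or shows a high observed error rate. The function name, thresholds, and activity names are illustrative assumptions only.

```python
from datetime import date, timedelta

def recredentialing_date(issue_date, actions_performed, required_actions,
                         error_rate, base_period_days=365):
    """Shorten the re-credentialing window when the receiver performs few of
    the credential's associated activities or shows a high error rate."""
    coverage = len(set(required_actions) & set(actions_performed)) / len(required_actions)
    period = base_period_days
    if coverage < 0.5:        # receiver rarely performs the credentialed activities
        period //= 2
    if error_rate > 0.10:     # high observed error rate triggers earlier review
        period //= 2
    return issue_date + timedelta(days=period)

issued = date(2024, 1, 1)
# Coverage is 1 of 2 required activities (0.5) with a low error rate,
# so the full base period applies.
print(recredentialing_date(issued, ["suturing"], ["suturing", "intubation"], 0.02))
```

A real platform server would presumably derive the coverage and error-rate inputs from the sensor-based monitoring systems described above rather than receive them directly.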
Additional techniques described herein may include generating and issuing digital credentials within a credentialing environment, including storing digital credentials with associated sensor data collected via a sensor-monitored environment. For example, a digital credential generation system may include a sensor-based monitoring or detection system, along with digital credential generation and issuing components. During an evaluation of a credential receiver, and generation/issuance of digital credentials to the receiver, the receiver may be monitored using various sensors. The digital credential generation system may determine the relevant sensor data, discarding unnecessary sensor data in some cases, and may store the relevant sensor data in a digital credential storage repository associated with the particular credential receiver. The associated sensor data may serve as authentication data and/or evidence of the completion of the credential criteria by the receiver. Additionally, in some cases, the stored sensor data may be automatically applied to additional digital credential criteria, such as updated criteria from the same digital credential, or criteria for different types of digital credentials, and the system may determine the receiver's eligibility based on the additional criteria. Thus, such techniques may allow the digital credential generation system to automatically generate and issue updated and/or additional digital credentials to receivers based on analyses of previously collected sensor data records, rather than requiring retesting or reevaluation by the receiver.
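Re-applying previously stored sensor data records to updated criteria, so that a new or updated credential can be issued without retesting, might look like the following sketch. The record fields, metric names, and criteria ranges are hypothetical.

```python
# Illustrative stored sensor records for one credential receiver.
stored_records = [
    {"receiver": "user-42", "metric": "completion_time_s", "value": 85},
    {"receiver": "user-42", "metric": "protocol_steps_completed", "value": 12},
]

def meets_updated_criteria(records, criteria):
    """Check stored sensor records against new criteria; each criterion is a
    (metric, minimum, maximum) triple. Missing metrics fail the check."""
    by_metric = {r["metric"]: r["value"] for r in records}
    return all(
        metric in by_metric and lo <= by_metric[metric] <= hi
        for metric, lo, hi in criteria
    )

# Updated criteria for the same credential type, evaluated without retesting.
updated_criteria = [("completion_time_s", 0, 120),
                    ("protocol_steps_completed", 10, 15)]
print(meets_updated_criteria(stored_records, updated_criteria))  # True
```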
The techniques described herein may further include analysis, generation, and issuance of digital credentials based on feedback associated with credential receivers. For example, digital credential generation systems may use sensor detection systems to monitor the user actions of the credential receiver and capture sensor data of the user during the interactive assessments and other credentialing processes. Such captured sensor data may be, in some examples, non-verbal, non-written user response data detected during interactive computer-based assessment sessions, and may be collected using various devices such as biometric sensors, video or audio recording devices, facial expression or gesture capturing and analysis software, etc. The captured sensor data then may be stored as feedback data, along with the responses from the receiver collected during the interactive assessments. Such feedback data may be evaluated and used to authenticate the user and to determine user emotion/response data during certain simulations and assessments. Additionally, the sensor feedback data collected for the user may be used in eligibility determinations for digital credentials in certain embodiments.
With reference now to
The content distribution network 100 may include one or more data store servers 104, such as database servers and file-based storage systems. Data stores 104 may comprise stored data relevant to the functions of the content distribution network 100. Illustrative examples of data stores 104 that may be maintained in certain embodiments of the content distribution network 100 are described below in reference to
Content distribution network 100 also may include one or more user devices 106 and/or supervisor devices 110. User devices 106 and supervisor devices 110 may display content received via the content distribution network 100, and may support various types of user interactions with the content. User devices 106 and supervisor devices 110 may include mobile devices such as smartphones, tablet computers, personal digital assistants, and wearable computing devices. Such mobile devices may run a variety of mobile operating systems, and may be enabled for Internet, e-mail, short message service (SMS), Bluetooth®, mobile radio-frequency identification (M-RFID), and/or other communication protocols. Other user devices 106 and supervisor devices 110 may be general purpose personal computers or special-purpose computing devices including, by way of example, personal computers, laptop computers, workstation computers, projection devices, and interactive room display systems. Additionally, user devices 106 and supervisor devices 110 may be any other electronic devices, such as thin-client computers, Internet-enabled gaming systems, business or home appliances, and/or personal messaging devices, capable of communicating over network(s) 120.
In different contexts of content distribution networks 100, user devices 106 and supervisor devices 110 may correspond to different types of specialized devices, for example, student devices and teacher devices in an educational network, employee devices and presentation devices in a company network, different gaming devices in a gaming network, etc. In some embodiments, user devices 106 and supervisor devices 110 may operate in the same physical location 107, such as a classroom or conference room. In such cases, the devices may contain components that support direct communications with other nearby devices, such as wireless transceivers and wireless communications interfaces, Ethernet sockets or other Local Area Network (LAN) interfaces, etc. In other implementations, the user devices 106 and supervisor devices 110 need not be used at the same location 107, but may be used in remote geographic locations in which each user device 106 and supervisor device 110 may use security features and/or specialized hardware (e.g., hardware-accelerated SSL and HTTPS, WS-Security, firewalls, etc.) to communicate with the content management server 102 and/or other remotely located user devices 106. Additionally, different user devices 106 and supervisor devices 110 may be assigned different designated roles, such as presenter devices, teacher devices, administrator devices, or the like, and in such cases the different devices may be provided with additional hardware and/or software components to provide content and support user capabilities not available to the other devices.
The content distribution network 100 also may include a privacy server 108 that maintains private user information at the privacy server 108 while using applications or services hosted on other servers. For example, the privacy server 108 may be used to maintain private data of a user within one jurisdiction even though the user is accessing an application hosted on a server (e.g., the content management server 102) located outside the jurisdiction. In such cases, the privacy server 108 may intercept communications between a user device 106 or supervisor device 110 and other devices that include private user information. The privacy server 108 may create a token or identifier that does not disclose the private information and may use the token or identifier when communicating with the other servers and systems, instead of using the user's private information.
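The token-substitution behavior of the privacy server 108 might be sketched as follows. This is a simplified illustration under stated assumptions: the class name, the choice of private fields, and the `tok_` token format are all hypothetical, and a real privacy server would persist its mapping securely within the jurisdiction rather than in memory.

```python
import secrets

class PrivacyServer:
    """Replaces private user fields with opaque tokens before data leaves
    the jurisdiction; keeps the token-to-value mapping only locally."""

    def __init__(self, private_fields=("name", "email")):
        self._private_fields = private_fields
        self._vault = {}  # token -> private value, stored only on this server

    def tokenize(self, record):
        """Return a copy of record with private fields replaced by tokens."""
        redacted = dict(record)
        for field in self._private_fields:
            if field in redacted:
                token = "tok_" + secrets.token_hex(8)
                self._vault[token] = redacted[field]
                redacted[field] = token
        return redacted

    def resolve(self, token):
        """Recover the original value from a token (privacy server only)."""
        return self._vault[token]

server = PrivacyServer()
outbound = server.tokenize({"name": "Alice", "score": 97})
print(outbound["score"])                  # 97 — non-private data passes through
print(server.resolve(outbound["name"]))   # Alice — recoverable only locally
```

The application server outside the jurisdiction sees only the token, while the privacy server can resolve it back to the private value when needed.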
As illustrated in
Content server 112 may include hardware and software components to generate, store, and maintain the content resources for distribution to user devices 106 and other devices in the network 100. For example, in content distribution networks 100 used for professional training and educational purposes, content server 112 may include data stores of training materials, presentations, interactive programs and simulations, course models, course outlines, and various training interfaces that correspond to different materials and/or different types of user devices 106. In content distribution networks 100 used for media distribution, interactive gaming, and the like, a content server 112 may include media content files such as music, movies, television programming, games, and advertisements.
User data server 114 may include hardware and software components that store and process data for multiple users relating to each user's activities and usage of the content distribution network 100. For example, the content management server 102 may record and track each user's system usage, including their user device 106, content resources accessed, and interactions with other user devices 106. This data may be stored and processed by the user data server 114, to support user tracking and analysis features. For instance, in the professional training and educational contexts, the user data server 114 may store and analyze each user's training materials viewed, presentations attended, courses completed, interactions, evaluation results, and the like. The user data server 114 may also include a repository for user-generated material, such as evaluations and tests completed by users, and documents and assignments prepared by users. In the context of media distribution and interactive gaming, the user data server 114 may store and process resource access data for multiple users (e.g., content titles accessed, access times, data usage amounts, gaming histories, user devices and device types, etc.).
Administrator server 116 may include hardware and software components to initiate various administrative functions at the content management server 102 and other components within the content distribution network 100. For example, the administrator server 116 may monitor device status and performance for the various servers, data stores, and/or user devices 106 in the content distribution network 100. When necessary, the administrator server 116 may add or remove devices from the network 100, and perform device maintenance such as providing software updates to the devices in the network 100. Various administrative tools on the administrator server 116 may allow authorized users to set user access permissions to various content resources, monitor resource usage by users and devices 106, and perform analyses and generate reports on specific network users and/or devices (e.g., resource usage tracking reports, training evaluations, etc.).
The content distribution network 100 may include one or more communication networks 120. Although only a single network 120 is identified in
With reference to
Client devices 206 may be configured to receive and execute client applications over one or more networks 220. Such client applications may be web browser based applications and/or standalone software applications, such as mobile device applications. Server 202 may be communicatively coupled with the client devices 206 via one or more communication networks 220. Client devices 206 may receive client applications from server 202 or from other application providers (e.g., public or private application stores). Server 202 may be configured to run one or more server software applications or services, for example, web-based or cloud-based services, to support content distribution and interaction with client devices 206. Users operating client devices 206 may in turn utilize one or more client applications (e.g., virtual client applications) to interact with server 202 to utilize the services provided by these components.
Various different subsystems and/or components 204 may be implemented on server 202. Users operating the client devices 206 may initiate one or more client applications to use services provided by these subsystems and components. The subsystems and components within the server 202 and client devices 206 may be implemented in hardware, firmware, software, or combinations thereof. Various different system configurations are possible in different distributed computing systems 200 and content distribution networks 100. The embodiment shown in
Although exemplary computing environment 200 is shown with four client computing devices 206, any number of client computing devices may be supported. Other devices, such as specialized sensor devices, etc., may interact with client devices 206 and/or server 202.
As shown in
Security and integration components 208 may implement various security features for data transmission and storage, such as authenticating users and restricting access to unknown or unauthorized users. In various implementations, security and integration components 208 may provide, for example, a file-based integration scheme or a service-based integration scheme for transmitting data between the various devices in the content distribution network 100. Security and integration components 208 also may use secure data transmission protocols and/or encryption for data transfers, for example, File Transfer Protocol (FTP), Secure File Transfer Protocol (SFTP), and/or Pretty Good Privacy (PGP) encryption.
In some embodiments, one or more web services may be implemented within the security and integration components 208 and/or elsewhere within the content distribution network 100. Such web services, including cross-domain and/or cross-platform web services, may be developed for enterprise use in accordance with various web service standards, such as RESTful web services (i.e., services based on the Representational State Transfer (REST) architectural style and constraints), and/or web services designed in accordance with the Web Service Interoperability (WS-I) guidelines. Some web services may use the Secure Sockets Layer (SSL) or Transport Layer Security (TLS) protocol to provide secure connections between the server 202 and user devices 206. SSL or TLS may use HTTP or HTTPS to provide authentication and confidentiality. In other examples, web services may be implemented using REST over HTTPS with the OAuth open standard for authentication, or using the WS-Security standard, which provides for secure SOAP messages using XML encryption. In other examples, the security and integration components 208 may include specialized hardware for providing secure web services. For example, security and integration components 208 may include secure network appliances having built-in features such as hardware-accelerated SSL and HTTPS, WS-Security, and firewalls. Such specialized hardware may be installed and configured in front of any web servers, so that any external devices may communicate directly with the specialized hardware.
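A client-side call to such a REST-over-HTTPS web service carrying an OAuth 2.0 bearer token might be sketched as below. The URL and token are placeholders, and the request is only constructed here, not actually sent.

```python
import urllib.request

def build_authenticated_request(url, access_token):
    """Construct an HTTPS request carrying an OAuth bearer token."""
    if not url.startswith("https://"):
        raise ValueError("secure web services require HTTPS")
    return urllib.request.Request(
        url,
        headers={"Authorization": f"Bearer {access_token}",
                 "Accept": "application/json"},
    )

# Hypothetical endpoint; in practice the token would come from an
# OAuth authorization server, not a literal string.
req = build_authenticated_request("https://example.com/api/credentials",
                                  "TOKEN123")
print(req.get_header("Authorization"))  # Bearer TOKEN123
```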
Communication network(s) 220 may be any type of network familiar to those skilled in the art that can support data communications using any of a variety of commercially-available protocols, including without limitation, TCP/IP (transmission control protocol/Internet protocol), SNA (systems network architecture), IPX (Internet packet exchange), Secure Sockets Layer (SSL) or Transport Layer Security (TLS) protocols, Hyper Text Transfer Protocol (HTTP) and Secure Hyper Text Transfer Protocol (HTTPS), Bluetooth®, Near Field Communication (NFC), and the like. Merely by way of example, network(s) 220 may be local area networks (LAN), such as one based on Ethernet, Token-Ring and/or the like. Network(s) 220 also may be wide-area networks, such as the Internet. Networks 220 may include telecommunication networks such as public switched telephone networks (PSTNs), or virtual networks such as an intranet or an extranet. Infrared and wireless networks (e.g., using the Institute of Electrical and Electronics Engineers (IEEE) 802.11 protocol suite or other wireless protocols) also may be included in networks 220.
Computing environment 200 also may include one or more data stores 210 and/or back-end servers 212. In certain examples, the data stores 210 may correspond to data store server(s) 104 discussed above in
With reference to
The paragraphs below describe examples of specific data stores that may be implemented within some embodiments of a content distribution network 100. It should be understood that the below descriptions of data stores 301-309, including their functionality and types of data stored therein, are illustrative and non-limiting. Data store server architecture and design, and the implementation of specific data stores 301-309, may depend on the context, size, and functional requirements of a content distribution network 100. For example, in content distribution systems 100 used for professional training and educational purposes, separate databases or file-based storage systems may be implemented in data store server(s) 104 to store trainee and/or student data, trainer and/or professor data, training module data and content descriptions, training results, evaluation data, and the like. In contrast, in content distribution systems 100 used for media distribution from content providers to subscribers, separate data stores may be implemented in data stores server(s) 104 to store listings of available content titles and descriptions, content title usage statistics, subscriber profiles, account data, payment data, network usage statistics, etc.
A user profile data store 301 may include information relating to the end users within the content distribution network 100. This information may include user characteristics such as the user names, access credentials (e.g., logins and passwords), user preferences, and information relating to any previous user interactions within the content distribution network 100 (e.g., requested content, posted content, content modules completed, training scores or evaluations, other associated users, etc.).
An accounts data store 302 may generate and store account data for different users in various roles within the content distribution network 100. For example, accounts may be created in an accounts data store 302 for individual end users, supervisors, administrator users, and entities such as companies or educational institutions. Account data may include account types, current account status, account characteristics, and any parameters, limits, or restrictions associated with the accounts.
A content library data store 303 may include information describing the individual content items (or content resources) available via the content distribution network 100. In some embodiments, the library data store 303 may include metadata, properties, and other characteristics associated with the content resources stored in the content server 112. Such data may identify one or more aspects or content attributes of the associated content resources, for example, subject matter, access level, or skill level of the content resources, license attributes of the content resources (e.g., any limitations and/or restrictions on the licensable use and/or distribution of the content resource), price attributes of the content resources (e.g., a price and/or price structure for determining a payment amount for use or distribution of the content resource), rating attributes for the content resources (e.g., data indicating the evaluation or effectiveness of the content resource), and the like. In some embodiments, the library data store 303 may be configured to allow updating of content metadata or properties, and to allow the addition and/or removal of information relating to the content resources. For example, content relationships may be implemented as graph structures, which may be stored in the library data store 303 or in an additional store for use by selection algorithms along with the other metadata.
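The graph structures suggested above for content relationships might be represented as a simple adjacency-list mapping from each content item to related items, which a selection algorithm could then traverse. The content identifiers and the breadth-first traversal below are illustrative assumptions.

```python
from collections import deque

# Hypothetical content relationships: each item maps to follow-on items.
content_graph = {
    "algebra-1": ["algebra-2"],
    "algebra-2": ["calculus-1"],
    "calculus-1": [],
    "geometry-1": ["calculus-1"],
}

def related_content(graph, start):
    """Breadth-first traversal returning all content reachable from start,
    e.g., for a selection algorithm recommending follow-on material."""
    seen, queue, order = {start}, deque([start]), []
    while queue:
        node = queue.popleft()
        for neighbor in graph.get(node, []):
            if neighbor not in seen:
                seen.add(neighbor)
                order.append(neighbor)
                queue.append(neighbor)
    return order

print(related_content(content_graph, "algebra-1"))  # ['algebra-2', 'calculus-1']
```

In the library data store 303, such a graph could be stored alongside the other content metadata and queried together with attributes like subject matter or skill level.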
A pricing data store 304 may include pricing information and/or pricing structures for determining payment amounts for providing access to the content distribution network 100 and/or the individual content resources within the network 100. In some cases, pricing may be determined based on a user's access to the content distribution network 100, for example, a time-based subscription fee, or pricing based on network usage. In other cases, pricing may be tied to specific content resources. Certain content resources may have associated pricing information, whereas other pricing determinations may be based on the resources accessed, the profiles and/or accounts of the user, and the desired level of access (e.g., duration of access, network speed, etc.). Additionally, the pricing data store 304 may include information relating to compilation pricing for groups of content resources, such as group prices and/or price structures for groupings of resources.
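One way the pricing determinations above could combine is sketched below. The precedence (subscription, then group price, then per-resource sum) and all names are hypothetical assumptions for illustration, not the pricing structure of the data store 304 itself.

```python
def compute_price(user, resources, group_prices):
    """Illustrative pricing rules: subscribed users pay a flat fee; otherwise
    a known grouping of resources gets its compilation price, and any other
    request is priced per resource. (Hypothetical precedence.)"""
    if user.get("subscription"):
        return user["subscription_fee"]           # time-based subscription fee
    key = frozenset(r["id"] for r in resources)
    if key in group_prices:                       # compilation pricing for a group
        return group_prices[key]
    return sum(r["price"] for r in resources)     # per-resource pricing

group_prices = {frozenset({"c-101", "c-102"}): 15.00}
cart = [{"id": "c-101", "price": 9.99}, {"id": "c-102", "price": 9.99}]
```

Note the group price here undercuts the per-resource sum, reflecting the kind of bundled price structure the pricing data store 304 might hold.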
A license data store 305 may include information relating to licenses and/or licensing of the content resources within the content distribution network 100. For example, the license data store 305 may identify licenses and licensing terms for individual content resources and/or compilations of content resources in the content server 112, the rights holders for the content resources, and/or common or large-scale right holder information such as contact information for rights holders of content not included in the content server 112.
A content access data store 306 may include access rights and security information for the content distribution network 100 and specific content resources. For example, the content access data store 306 may include login information (e.g., user identifiers, logins, passwords, etc.) that can be verified during user login attempts to the network 100. The content access data store 306 also may be used to store assigned user roles and/or user levels of access. For example, a user's access level may correspond to the sets of content resources and/or the client or server applications that the user is permitted to access. Certain users may be permitted or denied access to certain applications and resources based on their subscription level, training program, course/grade level, etc. Certain users may have supervisory access over one or more end users, allowing the supervisor to access all or portions of the end user's content, activities, evaluations, etc. Additionally, certain users may have administrative access over some users and/or some applications in the content management network 100, allowing such users to add and remove user accounts, modify user access permissions, perform maintenance updates on software and servers, etc.
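A role-to-access-level mapping of the kind stored in the content access data store 306 might be checked as follows. The role names, application names, and table layout are hypothetical, chosen only to mirror the end user/supervisor/administrator tiers described above.

```python
# Hypothetical access-level table: each assigned user role maps to the set of
# applications that role is permitted to access (cf. content access data store 306).
ROLE_PERMISSIONS = {
    "end_user":   {"content_viewer"},
    "supervisor": {"content_viewer", "evaluation_reports"},
    "admin":      {"content_viewer", "evaluation_reports", "account_admin"},
}

def may_access(role: str, application: str) -> bool:
    """Return True if the given role's access level permits the application;
    unknown roles are denied by default."""
    return application in ROLE_PERMISSIONS.get(role, set())
```

Denying unknown roles by default follows the usual fail-closed convention for access control.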
A source data store 307 may include information relating to the source of the content resources available via the content distribution network. For example, a source data store 307 may identify the authors and originating devices of content resources, previous pieces of data and/or groups of data originating from the same authors or originating devices, and the like.
An evaluation data store 308 may include information used to direct the evaluation of users and content resources in the content management network 100. In some embodiments, the evaluation data store 308 may contain, for example, the analysis criteria and the analysis guidelines for evaluating users (e.g., trainees/students, gaming users, media content consumers, etc.) and/or for evaluating the content resources in the network 100. The evaluation data store 308 also may include information relating to evaluation processing tasks, for example, the identification of users and user devices 106 that have received certain content resources or accessed certain applications, the status of evaluations or evaluation histories for content resources, users, or applications, and the like. Evaluation criteria stored in the evaluation data store 308 may include data and/or instructions in the form of one or more electronic rubrics or scoring guides for use in the evaluation of the content, users, or applications. The evaluation data store 308 also may include past evaluations and/or evaluation analyses for users, content, and applications, including relative rankings, characterizations, explanations, and the like.
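An electronic rubric of the sort stored in the evaluation data store 308 could be applied as in this sketch, where a rubric maps criteria to weights and an evaluation supplies per-criterion marks on a 0-1 scale. The criteria names and weighted-average scoring rule are assumptions for illustration.

```python
def score_with_rubric(rubric, marks):
    """Apply an electronic rubric (criterion -> weight) to per-criterion marks
    on a 0-1 scale, returning a weighted score in [0, 1]. Hypothetical format."""
    total_weight = sum(rubric.values())
    earned = sum(rubric[c] * marks.get(c, 0.0) for c in rubric)  # missing marks count as 0
    return earned / total_weight

rubric = {"accuracy": 3.0, "speed": 1.0, "safety": 2.0}
marks = {"accuracy": 1.0, "speed": 0.5, "safety": 1.0}
```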
In addition to the illustrative data stores described above, data store server(s) 104 (e.g., database servers, file-based storage servers, etc.) may include one or more external data aggregators 309. External data aggregators 309 may include third-party data sources accessible to the content management network 100, but not maintained by the content management network 100. External data aggregators 309 may include any electronic information source relating to the users, content resources, or applications of the content distribution network 100. For example, external data aggregators 309 may be third-party data stores containing demographic data, education related data, consumer sales data, health related data, and the like. Illustrative external data aggregators 309 may include, for example, social networking web servers, public records data stores, learning management systems, educational institution servers, business servers, consumer sales data stores, medical record data stores, etc. Data retrieved from various external data aggregators 309 may be used to verify and update user account information, suggest user content, and perform user and content evaluations.
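The use of aggregator data "to verify and update user account information" could follow a merge policy like the one sketched here, in which externally sourced fields supplement an account but never overwrite values the account already holds. Both the policy and the field names are hypothetical assumptions.

```python
def merge_external_profile(account, external_records):
    """Supplement a user account with fields retrieved from external data
    aggregators, without overwriting existing account values (hypothetical policy)."""
    merged = dict(account)
    for record in external_records:
        for key, value in record.items():
            merged.setdefault(key, value)  # only fill fields the account lacks
    return merged

account = {"user_id": "u-42", "email": "a@example.com"}
external = [{"email": "stale@example.com", "employer": "Acme"},
            {"degree": "BSc"}]
```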
With reference now to
A content management server 102 may include a content customization system 402. The content customization system 402 may be implemented using dedicated hardware within the content distribution network 100 (e.g., a content customization server 402), or using designated hardware and software resources within a shared content management server 102. In some embodiments, the content customization system 402 may adjust the selection and adaptive capabilities of content resources to match the needs and desires of the users receiving the content. For example, the content customization system 402 may query various data stores and servers 104 to retrieve user information, such as user preferences and characteristics (e.g., from a user profile data store 301), user access restrictions to content resources (e.g., from a content access data store 306), previous user results and content evaluations (e.g., from an evaluation data store 308), and the like. Based on the retrieved information from data stores 104 and other data sources, the content customization system 402 may modify content resources for individual users.
A content management server 102 also may include a user management system 404. The user management system 404 may be implemented using dedicated hardware within the content distribution network 100 (e.g., a user management server 404), or using designated hardware and software resources within a shared content management server 102. In some embodiments, the user management system 404 may monitor the progress of users through various types of content resources and groups, such as media compilations, courses or curriculums in training or educational contexts, interactive gaming environments, and the like. For example, the user management system 404 may query one or more databases and/or data store servers 104 to retrieve user data such as associated content compilations or programs, content completion status, user goals, results, and the like.
A content management server 102 also may include an evaluation system 406. The evaluation system 406 may be implemented using dedicated hardware within the content distribution network 100 (e.g., an evaluation server 406), or using designated hardware and software resources within a shared content management server 102. The evaluation system 406 may be configured to receive and analyze information from user devices 106. For example, various ratings of content resources submitted by users may be compiled and analyzed, and then stored in a data store (e.g., a content library data store 303 and/or evaluation data store 308) associated with the content. In some embodiments, the evaluation server 406 may analyze the information to determine the effectiveness or appropriateness of content resources with respect to, for example, a subject matter, an age group, a skill level, or the like. In some embodiments, the evaluation system 406 may provide updates to the content customization system 402 or the user management system 404, with the attributes of one or more content resources or groups of resources within the network 100. The evaluation system 406 also may receive and analyze user evaluation data from user devices 106, supervisor devices 110, administrator servers 116, etc. For instance, evaluation system 406 may receive, aggregate, and analyze user evaluation data for different types of users (e.g., end users, supervisors, administrators, etc.) in different contexts (e.g., media consumer ratings, trainee or student comprehension levels, teacher effectiveness levels, gamer skill levels, etc.).
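The aggregation of user evaluation data across contexts could take the form sketched below, grouping raw evaluation records by resource and context before averaging. The record shape and context labels are illustrative assumptions, not the evaluation system 406's actual schema.

```python
from collections import defaultdict

def aggregate_ratings(evaluations):
    """Aggregate user evaluation records by (resource, context) -- e.g., media
    consumer ratings vs. student comprehension -- returning mean scores per bucket.
    (Illustrative only; real schemas would carry many more dimensions.)"""
    buckets = defaultdict(list)
    for e in evaluations:
        buckets[(e["resource"], e["context"])].append(e["score"])
    return {key: sum(scores) / len(scores) for key, scores in buckets.items()}

evals = [
    {"resource": "c-101", "context": "consumer_rating", "score": 4.0},
    {"resource": "c-101", "context": "consumer_rating", "score": 5.0},
    {"resource": "c-101", "context": "comprehension", "score": 0.8},
]
```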
A content management server 102 also may include a content delivery system 408. The content delivery system 408 may be implemented using dedicated hardware within the content distribution network 100 (e.g., a content delivery server 408), or using designated hardware and software resources within a shared content management server 102. The content delivery system 408 may receive content resources from the content customization system 402 and/or from the user management system 404, and provide the resources to user devices 106. The content delivery system 408 may determine the appropriate presentation format for the content resources based on the user characteristics and preferences, and/or the device capabilities of user devices 106. If needed, the content delivery system 408 may convert the content resources to the appropriate presentation format and/or compress the content before transmission. In some embodiments, the content delivery system 408 may also determine the appropriate transmission media and communication protocols for transmission of the content resources.
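The determination of an appropriate presentation format from device capabilities could be sketched as a simple preference-ordered match, as below. The format names, capability record, and plain-text fallback are hypothetical choices for illustration.

```python
def choose_format(resource_formats, device_caps):
    """Pick the first presentation format (in order of preference) that the
    user device supports, falling back to plain text. Hypothetical selection rule."""
    for fmt in resource_formats:
        if fmt in device_caps["supported_formats"]:
            return fmt
    return "text"  # fallback when no preferred format is supported

phone = {"supported_formats": {"mp4", "html"}}
kiosk = {"supported_formats": {"html"}}
```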
In some embodiments, the content delivery system 408 may include specialized security and integration hardware 410, along with corresponding software components to implement the appropriate security features for content transmission and storage, to provide the supported network and client access models, and to support the performance and scalability requirements of the network 100. The security and integration layer 410 may include some or all of the security and integration components 208 discussed above in
With reference now to
Bus subsystem 502 provides a mechanism for letting the various components and subsystems of computer system 500 communicate with each other as intended. Although bus subsystem 502 is shown schematically as a single bus, alternative embodiments of the bus subsystem may utilize multiple buses. Bus subsystem 502 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. Such architectures may include, for example, an Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus, which can be implemented as a Mezzanine bus manufactured to the IEEE P1386.1 standard.
Processing unit 504, which may be implemented as one or more integrated circuits (e.g., a conventional microprocessor or microcontroller), controls the operation of computer system 500. One or more processors, including single core and/or multicore processors, may be included in processing unit 504. As shown in the figure, processing unit 504 may be implemented as one or more independent processing units 506 and/or 508 with single or multicore processors and processor caches included in each processing unit. In other embodiments, processing unit 504 may also be implemented as a quad-core processing unit or larger multicore designs (e.g., hexa-core processors, octo-core processors, ten-core processors, or greater).
Processing unit 504 may execute a variety of software processes embodied in program code, and may maintain multiple concurrently executing programs or processes. At any given time, some or all of the program code to be executed can be resident in processor(s) 504 and/or in storage subsystem 510. In some embodiments, computer system 500 may include one or more specialized processors, such as digital signal processors (DSPs), outboard processors, graphics processors, application-specific processors, and/or the like.
I/O subsystem 526 may include device controllers 528 for one or more user interface input devices and/or user interface output devices 530. User interface input and output devices 530 may be integral with the computer system 500 (e.g., integrated audio/video systems, and/or touchscreen displays), or may be separate peripheral devices which are attachable/detachable from the computer system 500.
Input devices 530 may include a keyboard, pointing devices such as a mouse or trackball, a touchpad or touch screen incorporated into a display, a scroll wheel, a click wheel, a dial, a button, a switch, a keypad, audio input devices with voice command recognition systems, microphones, and other types of input devices. Input devices 530 may also include three dimensional (3D) mice, joysticks or pointing sticks, gamepads and graphic tablets, and audio/visual devices such as speakers, digital cameras, digital camcorders, portable media players, webcams, image scanners, fingerprint scanners, barcode readers, 3D scanners, 3D printers, laser rangefinders, and eye gaze tracking devices. Additional input devices 530 may include, for example, motion sensing and/or gesture recognition devices that enable users to control and interact with an input device through a natural user interface using gestures and spoken commands, eye gesture recognition devices that detect eye activity from users and transform the eye gestures into input for an input device, voice recognition sensing devices that enable users to interact with voice recognition systems through voice commands, medical imaging input devices, MIDI keyboards, digital musical instruments, and the like.
Output devices 530 may include one or more display subsystems, indicator lights, or non-visual displays such as audio output devices, etc. Display subsystems may include, for example, cathode ray tube (CRT) displays, flat-panel devices, such as those using a liquid crystal display (LCD) or plasma display, light-emitting diode (LED) displays, projection devices, touch screens, and the like. In general, use of the term “output device” is intended to include all possible types of devices and mechanisms for outputting information from computer system 500 to a user or other computer. For example, output devices 530 may include, without limitation, a variety of display devices that visually convey text, graphics and audio/video information such as monitors, printers, speakers, headphones, automotive navigation systems, plotters, voice output devices, and modems.
Computer system 500 may comprise one or more storage subsystems 510, comprising hardware and software components used for storing data and program instructions, such as system memory 518 and computer-readable storage media 516. The system memory 518 and/or computer-readable storage media 516 may store program instructions that are loadable and executable on processing units 504, as well as data generated during the execution of these programs.
Depending on the configuration and type of computer system 500, system memory 518 may be stored in volatile memory (such as random access memory (RAM) 512) and/or in non-volatile storage drives 514 (such as read-only memory (ROM), flash memory, etc.). The RAM 512 may contain data and/or program modules that are immediately accessible to and/or presently being operated and executed by processing units 504. In some implementations, system memory 518 may include multiple different types of memory, such as static random access memory (SRAM) or dynamic random access memory (DRAM). In some implementations, a basic input/output system (BIOS), containing the basic routines that help to transfer information between elements within computer system 500, such as during start-up, may typically be stored in the non-volatile storage drives 514. By way of example, and not limitation, system memory 518 may include application programs 520, such as client applications, Web browsers, mid-tier applications, server applications, etc., program data 522, and an operating system 524.
Storage subsystem 510 also may provide one or more tangible computer-readable storage media 516 for storing the basic programming and data constructs that provide the functionality of some embodiments. Software (programs, code modules, instructions) that when executed by a processor provide the functionality described herein may be stored in storage subsystem 510. These software modules or instructions may be executed by processing units 504. Storage subsystem 510 may also provide a repository for storing data used in accordance with the present invention.
Storage subsystem 510 may also include a computer-readable storage media reader that can further be connected to computer-readable storage media 516. Together and, optionally, in combination with system memory 518, computer-readable storage media 516 may comprehensively represent remote, local, fixed, and/or removable storage devices plus storage media for temporarily and/or more permanently containing, storing, transmitting, and retrieving computer-readable information.
Computer-readable storage media 516 containing program code, or portions of program code, may include any appropriate media known or used in the art, including storage media and communication media, such as but not limited to, volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage and/or transmission of information. This can include tangible computer-readable storage media such as RAM, ROM, electronically erasable programmable ROM (EEPROM), flash memory or other memory technology, CD-ROM, digital versatile disk (DVD), or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or other tangible computer readable media. This can also include nontangible computer-readable media, such as data signals, data transmissions, or any other medium which can be used to transmit the desired information and which can be accessed by computer system 500.
By way of example, computer-readable storage media 516 may include a hard disk drive that reads from or writes to non-removable, nonvolatile magnetic media, a magnetic disk drive that reads from or writes to a removable, nonvolatile magnetic disk, and an optical disk drive that reads from or writes to a removable, nonvolatile optical disk such as a CD ROM, DVD, and Blu-Ray® disk, or other optical media. Computer-readable storage media 516 may include, but is not limited to, Zip® drives, flash memory cards, universal serial bus (USB) flash drives, secure digital (SD) cards, DVD disks, digital video tape, and the like. Computer-readable storage media 516 may also include solid-state drives (SSD) based on non-volatile memory such as flash-memory based SSDs, enterprise flash drives, solid state ROM, and the like, SSDs based on volatile memory such as solid state RAM, dynamic RAM, static RAM, DRAM-based SSDs, magnetoresistive RAM (MRAM) SSDs, and hybrid SSDs that use a combination of DRAM and flash memory based SSDs. The disk drives and their associated computer-readable media may provide non-volatile storage of computer-readable instructions, data structures, program modules, and other data for computer system 500.
Communications subsystem 532 may provide a communication interface between computer system 500 and external computing devices via one or more communication networks, including local area networks (LANs), wide area networks (WANs) (e.g., the Internet), and various wireless telecommunications networks. As illustrated in
The various physical components of the communications subsystem 532 may be detachable components coupled to the computer system 500 via a computer network, a FireWire® bus, or the like, and/or may be physically integrated onto a motherboard of the computer system 500. Communications subsystem 532 also may be implemented in whole or in part by software.
In some embodiments, communications subsystem 532 may also receive input communication in the form of structured and/or unstructured data feeds, event streams, event updates, and the like, on behalf of one or more users who may use or access computer system 500. For example, communications subsystem 532 may be configured to receive data feeds in real-time from users of social networks and/or other communication services, web feeds such as Rich Site Summary (RSS) feeds, and/or real-time updates from one or more third party information sources (e.g., data aggregators 309). Additionally, communications subsystem 532 may be configured to receive data in the form of continuous data streams, which may include event streams of real-time events and/or event updates (e.g., sensor data applications, financial tickers, network performance measuring tools, clickstream analysis tools, automobile traffic monitoring, etc.). Communications subsystem 532 may output such structured and/or unstructured data feeds, event streams, event updates, and the like to one or more data stores 104 that may be in communication with one or more streaming data source computers coupled to computer system 500.
Due to the ever-changing nature of computers and networks, the description of computer system 500 depicted in the figure is intended only as a specific example. Many other configurations having more or fewer components than the system depicted in the figure are possible. For example, customized hardware might also be used and/or particular elements might be implemented in hardware, firmware, software, or a combination. Further, connection to other computing devices, such as network input/output devices, may be employed. Based on the disclosure and teachings provided herein, a person of ordinary skill in the art will appreciate other ways and/or methods to implement the various embodiments.
With reference now to
As used herein, a digital credential template (or digital badge template) may refer to an electronic document or data structure storing a general (e.g., non-user specific) template or description of a specific type of digital credential that may be issued to an individual. Digital credential templates may include, for example, a description of the skills, proficiencies, and/or achievements that the digital credential represents. This description may take the form of diploma data, certification data, and/or license data, including the parent organization (i.e., the digital credential template owner) responsible for creating and defining the digital credential template. Examples of digital credential templates may include templates for various technology certifications, licensure exams, professional tests, training course completion certificates, and the like. In contrast to a digital credential template, a digital credential (or digital badge) may refer to an instance of an electronic document or data structure, generated for a specific individual (i.e., the credential receiver), and based on a digital credential template. Thus, a digital credential document or data structure may be based on a corresponding digital credential template, but may be customized and populated with user-specific information such as individual identification data (e.g., name, email address, and other user identifiers), credential issuance data (e.g., issue date, geographic location of issuance, authorized issuer of the credential, etc.), and links or embedded data that contain the specific user's supporting documentation or evidence relating to the credential.
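The template-versus-instance distinction above can be sketched as two data structures, where an `issue` step instantiates a user-specific credential from a general template. The field names and the `issue` helper are illustrative assumptions, not the platform's actual data model.

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class CredentialTemplate:
    """General (non-user-specific) description of a digital credential type."""
    template_id: str
    title: str
    issuer_org: str        # parent organization / digital credential template owner
    skills: tuple          # skills or proficiencies the credential represents

@dataclass(frozen=True)
class DigitalCredential:
    """A user-specific instance generated from a credential template."""
    template: CredentialTemplate
    receiver_name: str
    receiver_email: str
    issued_on: date
    evidence_url: str      # link to the receiver's supporting documentation

def issue(template, name, email, evidence_url, on=None):
    """Instantiate a credential for a specific credential receiver."""
    return DigitalCredential(template, name, email, on or date.today(), evidence_url)

tmpl = CredentialTemplate("t-1", "Network Security Fundamentals",
                          "Example Institute", ("firewalls", "tls"))
badge = issue(tmpl, "Ada Lovelace", "ada@example.com",
              "https://example.com/evidence/1", on=date(2024, 1, 15))
```

Freezing both dataclasses reflects that an issued credential, like a template, is a record to be verified rather than mutated.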
As shown in this example, the system 600 also may include a digital credential receiver system 640 and a digital credential endorser system 650. The digital credential receiver system 640 may be a computing device associated with a credential receiver (or credential earner), for example, an individual user of an electronic learning system, professional training system, online certification course, etc. In some embodiments, credential receivers may access the platform server 610 via systems 640 to accept or reject newly issued digital credentials, review and update their own set of previously earned digital credentials, as well as to publish (or share) their digital credentials via communication applications or publishing platforms such as social media systems. Digital credential endorser system 650 may be a computing system associated with an endorsing entity, such as an educational institution, business, or technical organization that has chosen to review and endorse a specific digital credential template. The platform server 610 may receive and track the endorsements received from systems 650, and may associate the endorsements with the user-specific digital credentials issued based on the endorsed templates.
Additionally, the digital credential management system 600 in this example includes a number of external client devices 660 and external digital credential publishers 670. External client devices 660 may correspond to computing systems of third-party users that may interact with the platform server 610 to initiate various functionality or retrieve data relating to templates and/or digital credentials managed by the platform 610. For example, a client device 660 may query the platform server 610 for data metrics and/or analyses relating to a subset of digital credentials stored in the digital credential data store 615. The third-party systems 660 also may provide data to the platform server 610 that may initiate updates to the templates and/or digital credentials stored in the data store 615. External digital credential publishers 670 may correspond to third-party systems configured to receive digital credential data from the platform 610 and publish (or present) the digital credential data to users. Examples of publishers 670 may include social media websites and systems, digital badge wallets, and/or other specialized servers or applications configured to store and present views of digital badges to users.
In various embodiments described herein, the generation and management of digital credentials, as well as the tracking and reporting of digital credential data, may be performed within CDNs 100, such as eLearning, professional training, and certification systems 100. For example, within the context of an eLearning CDN 100, a content management server 102 or other CDN server (e.g., 104, 112, 114, or 116) may create and store digital credential templates to describe and define various proficiencies, achievements, or certifications supported by the eLearning CDN 100. Additionally or alternatively, the content management server 102 or other servers of an eLearning CDN 100 may issue digital credentials to users, based on its own digital certificate templates and/or templates received from other systems or CDNs. Further, in some implementations, an eLearning CDN 100 may be configured to include a digital credential platform server 610 to store and manage templates and digital credentials between separate systems within the CDN 100. Thus, in various different implementations, the content management server(s) 102 of a CDN 100 may incorporate one or more digital certificate template owner system(s) 620, digital certificate issuer system(s) 630, and/or digital certificate platform server(s) 610. In such embodiments, the various components and functionalities described herein for the platform server 610, owner system 620, and/or issuer system 630 all may be implemented within one or more content management servers 102 (and/or other servers) of an eLearning or professional training CDN 100. In other examples, a digital credential platform server 610 may be implemented using one or more computer servers, and other specialized hardware and software components, separately from any other CDN components such as content servers 112, content management servers 102, data store servers 104, and the like. 
In these examples, the digital credential platform server 610 may be configured to communicate directly with related systems 620-670, or indirectly through content management servers 102 and/or other components and communications networks of the CDN 100.
In order to perform these features and other functionality described herein, each of the components and sub-components discussed in the example digital credential management system 600 may correspond to a single computer server or a complex computing system including a combination of computing devices, storage devices, network components, etc. Each of these components and their respective subcomponents may be implemented in hardware, software, or a combination thereof. Certain systems 620-670 may communicate directly with the platform server 610, while other systems 620-670 may communicate with the platform server 610 indirectly via one or more intermediary network components (e.g., routers, gateways, firewalls, etc.) or other devices (e.g., content management servers 102, content servers 112, etc.). Although the different communication networks and physical network components have not been shown in this example so as not to obscure the other elements depicted in the figure, it should be understood that any of the network hardware components and network architecture designs may be implemented in various embodiments to support communication between the systems, servers, and devices in the digital credential management system 600. Additionally, different systems 620-670 may use different networks and network types to communicate with the platform server 610, including one or more telecommunications networks, cable networks, satellite networks, cellular networks and other wireless networks, computer-based IP networks, and the like. Further, certain components within the digital credential management system 600 may include special purpose hardware devices and/or special purpose software, such as those included in I/O subsystem 611 and memory 614 of the platform server 610, as well as those within the memory of the other systems 620-670, and the digital credential data store 615 maintained by the platform server 610, discussed below.
Although the various interactions between the platform server 610 and other systems 620-670 may be described below in terms of a client-server model, it should be understood that other computing environments and various combinations of servers and devices may be used to perform the functionality described herein in other embodiments. For instance, although the requests/responses to determine the authorized issuers 630 for specific digital credential templates, the generation of digital credentials, and the retrieval and presentation of digital credential tracking and reporting data, may be performed by a centralized web-based platform server 610 in collaboration with various client applications at the other systems 620-670 (e.g., web browser applications or standalone client software), in other cases these techniques may be performed entirely by a specialized digital credential platform server 610, or entirely by one or more digital credential tools (e.g., software services) executing on any one of the systems 620-670. In other examples, a client-server model may be used as shown in system 600, but different functional components and processing tasks may be allocated to the client-side or the server-side in different embodiments. Additionally, the digital credential data store 615 may be implemented as separate servers or storage systems in some cases, and may use independent hardware and software service components. However, in other implementations, some or all of the digital credential data store 615 may be incorporated into the platform server 610 (as shown in this example) and/or may be incorporated into various other systems 620-670.
In some embodiments, each of the systems 620-670 that collaborate and communicate with the platform server 610 may be implemented as client computing systems, such as desktop or laptop computers, smartphones, tablet computers, and other various types of computing devices, each of which may include some or all of the hardware, software, and networking components discussed above. Specifically, any of client systems 620-670 may be implemented using any computing device with sufficient processing components, memory and software components, and I/O system components for interacting with users and supporting the desired set of communications with the platform server 610, as described herein. Accordingly, client systems 620-670 may include the necessary hardware and software components to establish the network interfaces, security and authentication capabilities, and capabilities for transmitting/receiving digital credential templates and digital credentials, digital credential data requests/responses to the platform server 610, etc. Each client system 620-670 may include an I/O subsystem, network interface controller, a processing unit, and memory configured to operate client software applications. The digital credential platform server 610 may be configured to receive and execute various programmatic and graphical interfaces for generating, managing, and tracking issued digital credentials, in collaboration with the various client systems 620-670. Accordingly, each client system 620-670 may include an I/O subsystem having hardware and software components to support a specific set of output capabilities (e.g., LCD display screen characteristics, screen size, color display, video driver, speakers, audio driver, graphics processor and drivers, etc.), and a specific set of input capabilities (e.g., keyboard, mouse, touchscreen, voice control, cameras, facial recognition, gesture recognition, etc.).
Different client systems 620-670 may support different input and output capabilities within their I/O subsystems, and thus different types of user interactions, and platform server 610 functionality may be compatible or incompatible with certain client systems 620-670. For example, certain types of digital credential generation and search functionality may require specific types of processors, graphics components, network components, or I/O components on the client system 620-670 in order to function optimally.
In some embodiments, the digital credential platform server 610 may generate and provide software interfaces (e.g., via a web-based application, or using other programmatic or graphical interface techniques) used by the various client systems 620-670 to perform the various digital credential management functionality described herein. In response to receiving inputs from a client system 620-670 corresponding to digital credentials, templates, credential search requests and criteria, etc., the platform server 610 may access the underlying digital credential data store 615 to perform the various functionality described herein. In order to perform the tasks described herein, platform server 610 may include components such as network interface controllers 612, processing units 613, and memory 614 configured to store server software, handle authentication and security, and to store, analyze, and manage the digital credentials, templates, and credential tracking data stored within the digital credential data store 615. As shown in this example, the digital credential data store 615 may be implemented as separate dedicated data stores (e.g., databases, file-based storage, etc.) used for storing digital credential template objects, issued digital credentials, credential tracking data, and authorized user/role data. The platform server 610 and data store 615 may be implemented as separate software (and/or storage) components within a single computer server 610 in some examples, while in other examples may be implemented as separate computer servers/systems having separate dedicated processing units, storage devices, and/or network components.
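For illustration only, the four dedicated stores described above (digital credential template objects, issued digital credentials, credential tracking data, and authorized user/role data) might be sketched as simple in-memory structures. All class and field names below are hypothetical, not taken from the described embodiments:

```python
from dataclasses import dataclass, field

@dataclass
class CredentialTemplate:
    template_id: str
    title: str
    issuer_id: str

@dataclass
class IssuedCredential:
    credential_id: str
    template_id: str
    receiver_id: str

@dataclass
class DigitalCredentialDataStore:
    # Sketch of the separate dedicated stores within data store 615.
    templates: dict = field(default_factory=dict)  # template_id -> CredentialTemplate
    issued: dict = field(default_factory=dict)     # credential_id -> IssuedCredential
    tracking: list = field(default_factory=list)   # credential tracking events
    roles: dict = field(default_factory=dict)      # user_id -> set of authorized roles

    def issue(self, credential_id, template_id, receiver_id):
        # Only issue a credential against a known template object.
        if template_id not in self.templates:
            raise KeyError(f"unknown template {template_id}")
        cred = IssuedCredential(credential_id, template_id, receiver_id)
        self.issued[credential_id] = cred
        self.tracking.append(("issued", credential_id, receiver_id))
        return cred
```

In an actual deployment these would be backed by databases or file-based storage as noted above; the sketch only shows the relationship between the four stores.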
Certain aspects described herein relate to the testing and certification processes used to verify the skills or qualifications that a user (or earner) has obtained in order to be awarded a digital credential (or badge) or any other skill certification from an institution or credentialing body. In some embodiments, physical testing environments including “simulation laboratories” may be implemented to allow users to perform physical tasks (including mental and/or computer-based tasks) in a monitored environment. Such physical testing environments may use virtual reality and/or augmented reality in various cases. The simulation lab and/or the user may be monitored by various sensors during testing or certification processes, and the results may be analyzed to determine (at least in part) whether or not the user should be awarded a particular digital credential or certification. As discussed below in more detail, simulation labs may be implemented as testing environments for manual tasks, computer-based tasks, scenario training, etc., and various monitoring of the simulation lab environment during testing may provide data metrics relating to successful completion of tasks, efficiency of task completion, user response times, user decision making behaviors, user biometrics and risk factors, etc. Further, as discussed below, certain simulation labs may provide the ability to change testing scenarios as well as environmental conditions (lighting, noise, temperature, etc.) during testing.
Referring now to
In addition to the testing equipment and apparatuses in the physical testing environment 700, the environment may have cameras 705 and sensors configured to monitor the performance and behavior of the user during the testing. As shown in this example, a number of cameras 705 may be installed throughout the testing environment 700 to capture image/video data of the user from different angles during the testing/skills verification process. In addition to cameras, in various embodiments, additional sensors may be deployed within the testing environment 700, including microphones, light sensors, heat sensors, vibration sensors, and any other sensor type appropriate for the type of testing/evaluation being performed. For instance, for testing of computer-based tasks, additional sensors such as mouse movement trackers, keystroke loggers, and user eye-tracking software may be used. For machine usage tasks, scenario training, and the like, movement sensors may be placed on the user and/or on any objects with which the user may interact during the testing scenario. Additionally, for any testing or skills evaluation scenario, certain embodiments may include biometric sensors and devices 710 configured to detect and track the user's biometric data during the testing process. Such biometric sensors and devices may measure the user's temperature, heartrate, blood pressure, respiration, skin conductivity, body movements, brainwave activities, etc.
In some embodiments, the physical testing environment 700 also may include various environmental controls that allow a test administrator to control the physical environmental conditions during a test or skills evaluation. Such environmental controls may include lights 715 that allow the test administrator to control the light levels, angles, and/or colors during a test. By way of example, lighting control within the environment 700 may allow the test administrator to evaluate the user's ability to perform a driving maneuver or roadside maintenance task at night, etc. Additional environmental controls may include temperature controls, weather simulation (e.g., wind, rain, snow, sunshine, fog, etc.), speakers to provide background noise or distraction, olfactory control that provides scents/odors to simulate the smells that may be present during a comparable real-life scenario, vibration control to simulate a corresponding real-life activity, and so on.
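As a rough sketch, the environmental controls described above could be driven by a condition profile that the test administrator applies before a session. The field names and values below are illustrative assumptions, not part of the described system:

```python
# Hypothetical environmental-condition profile for testing environment 700.
DEFAULT_PROFILE = {
    "lighting": {"level": 1.0, "color": "white"},  # 0.0 (dark) .. 1.0 (full)
    "temperature_c": 21.0,
    "weather": None,           # e.g., "rain", "snow", "fog"
    "background_noise": None,  # e.g., "traffic", "crowd"
    "scent": None,             # olfactory control
    "vibration": False,
}

def night_driving_profile():
    """Profile for evaluating a driving maneuver at night, per the example above."""
    profile = dict(DEFAULT_PROFILE)
    profile["lighting"] = {"level": 0.05, "color": "white"}
    profile["background_noise"] = "traffic"
    return profile
```

A controller process would then translate such a profile into commands for lights 715, speakers, climate systems, and so on.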
Referring now to
In step 801, a computer server controlling the physical testing environment 700 may receive input relating to the test or skills evaluation scenario to be executed within the physical testing environment 700. In step 802, the server may receive data identifying the particular user designated to complete the test or skills evaluation scenario.
In step 803, the server may retrieve the test or scenario to be loaded/executed within the physical testing environment 700. As noted above, the test or scenario may include interactive user software (e.g., driving or flight simulator programs, law enforcement scenarios, etc.) and/or may include testing software or other software programs loaded onto a desktop, laptop, or tablet computer. For instance, the test or scenario may require the user to work with computer-aided design software, spreadsheet software, database development software, etc. In other cases, the test or scenario may include audio and/or video files to be played via speakers and/or display screens within the physical testing environment 700, such as instructional videos or audio/visual test questions.
The test or scenario retrieved in step 803 also may be retrieved based on the identity of the particular user who will be completing the test or skills evaluation scenario. In some embodiments, the server of the physical testing environment 700 may be configured to select the appropriate test, scenario, and/or simulation (e.g., a particular software scenario, skill level, etc.) based on the user's current set of badges or digital credentials, the user's skill level, and/or the user's performance history on previous tests or scenarios within the testing environment 700. Additionally, in some cases, the server may vary scenarios/test questions so that a particular user does not receive the same test questions, scenarios, or other testing content that they have already completed (or completed within a particular recent time window).
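The scenario-variation logic described above, where the server avoids serving content the user has already completed within a recent time window, might be sketched as follows. Names, the window length, and the history format are assumptions for illustration:

```python
from datetime import datetime, timedelta

def select_scenario(scenarios, user_history, recent_window_days=180, now=None):
    """Pick the first scenario the user has not completed within the window.

    user_history: list of (scenario_id, completed_at datetime) pairs.
    Returns None if every candidate was recently completed (e.g., so a
    test administrator can intervene).
    """
    now = now or datetime.now()
    cutoff = now - timedelta(days=recent_window_days)
    recently_done = {
        s_id for (s_id, completed_at) in user_history if completed_at >= cutoff
    }
    for scenario in scenarios:
        if scenario not in recently_done:
            return scenario
    return None
```

The candidate list itself could already be filtered by the user's current badges, skill level, and performance history, per the paragraph above.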
In step 804, the server may determine and apply a set of environmental conditions within the physical testing environment 700 for the execution of the test or scenario. As noted above, the physical testing environment 700 in some embodiments may be capable of setting various environmental conditions such as lighting (e.g., to simulate day or night, and/or different real-world working environments), temperature and weather conditions (e.g., to simulate outdoor scenarios, different seasons and locations), noise (e.g., to provide background noise, traffic noise, distractions, etc.), and other various environmental conditions. The server may select and apply environmental conditions as part of the test or scenario selected in step 803, or as a separate determination which is performed based on random chance or selected by a test administrator, etc. For instance, for certain types of badges and other certifications, separate day and night testing of certain tasks may be required. In other cases, the environmental conditions may be selected randomly and changed for each testing session. In still other cases, users may select and/or save their preferred environmental conditions for different types of testing. Further, in some embodiments, the physical testing environment 700 may track and analyze the user's various testing or scenario performance metrics (e.g., accuracy, efficiency, safety, compliance, biometrics, etc.) under different environmental conditions, in order to determine the optimal environmental conditions for the particular user. In such cases, users may receive different badges or certifications (or may have different badge assigned characteristics or endorsements) based on their test or scenario performance in different environmental conditions.
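The per-user optimal-conditions determination described above reduces to comparing performance metrics grouped by environmental condition. A minimal sketch, with an assumed log format of (conditions label, score) pairs:

```python
def optimal_conditions(performance_log):
    """Return the conditions label with the highest average performance score.

    performance_log: list of (conditions, score) pairs, where conditions is
    a hashable label such as "day" or "night" and score is a float.
    """
    totals = {}
    for conditions, score in performance_log:
        total, count = totals.get(conditions, (0.0, 0))
        totals[conditions] = (total + score, count + 1)
    # Pick the label with the best mean score across sessions.
    return max(totals, key=lambda c: totals[c][0] / totals[c][1])
```

A real implementation would aggregate multiple metrics (accuracy, efficiency, safety, biometrics) rather than a single score, as the text indicates.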
In step 805, the computer server(s) associated with the physical testing environment 700 may execute the test or simulation scenario, during which the user's performance and any/all user reactions or responses may be monitored. As noted above, even for certain tests that are entirely manual in nature, the physical testing environment 700 may use cameras and any other sensors to monitor the user's actions. Such monitoring may include various aspects of the user's performance, such as answers to test questions selected via a testing computer terminal, or the user's interactions with physical objects (and/or other people) within the physical testing environment 700. The user's answers and actions may be recorded by cameras and computer input devices, and additional user data may be collected using various other sensors such as microphones, biometric sensors, etc.
In step 806, the results for the test and/or simulation scenario completed by the user may be analyzed. In some embodiments, such analyses may be based not only on the user's responses to particular test questions or scenarios. Additionally or alternatively, the analysis in step 806 may include an evaluation of the user's other reactions or responses, such as speed and confidence of action (e.g., as determined by user comments, speed of response, facial expression analysis, body movement analysis, biometric data, etc.), efficiency, safety, decision making, and user biometrics. One or more of these separate analyses may be performed in steps 807-810, and each may be performed independently of the others, or may be combined into a single analysis. For instance, in some cases the goal of the simulation might be only to measure the user's biometric data, and the user's actual responses to the questions/scenarios may be irrelevant and need not be evaluated in step 807. In other tests or simulation scenarios, the opposite analysis may be applied, where only the accuracy of the user's responses or behaviors is measured and analyzed in step 807, and the user's biometric data is irrelevant and thus the analysis in step 810 is not performed. As another example, in a certain simulation of driving, machine operation, use of force training, etc., the only relevant analysis to be performed may be a safety/decision making analysis in step 809, while the efficiency analysis in step 808 need not be performed. In other similar tests/situations, the server may apply both a safety/decision making analysis in step 809 and an efficiency analysis in step 808 (e.g., to confirm that a driving maneuver or route was completed both safely and efficiently, to confirm that a suspect was subdued safely and quickly, or to confirm that a manufacturing assembly task was performed safely and efficiently).
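The selective application of the analyses of steps 807-810 could be expressed as a mapping from scenario type to the analyses that apply. The scenario-type keys below are hypothetical labels derived from the examples in the text, not identifiers from the source:

```python
# Analyses correspond to steps 807 (accuracy), 808 (efficiency),
# 809 (safety/decision making), and 810 (biometrics).
ANALYSES_BY_SCENARIO = {
    "biometric_study": {"biometrics"},                           # step 810 only
    "knowledge_test": {"accuracy"},                              # step 807 only
    "use_of_force": {"safety_decision_making"},                  # step 809 only
    "driving_route": {"safety_decision_making", "efficiency"},   # steps 808-809
}

def analyses_for(scenario_type):
    """Return the set of analyses to run; default to running all four."""
    return ANALYSES_BY_SCENARIO.get(
        scenario_type,
        {"accuracy", "efficiency", "safety_decision_making", "biometrics"},
    )
```

Each returned label would dispatch to the corresponding analysis routine over the recorded sensor and response data.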
For example, in certain embodiments and implementations of the concepts discussed above in reference to
In some embodiments, outputting a physical simulation may include outputting audio and/or video simulation components within the physical simulation area, manipulating physical objects (e.g., motorized objects) during a live-action simulation within the physical simulation area, and/or outputting virtual reality simulations via a virtual reality headset. Additionally, certain embodiments may include generating physical simulation environments within a physical simulation area, including, for example, simulating ambient light conditions within the physical simulation area, outputting one or more background noise conditions within the physical simulation area, monitoring and controlling the physical temperature using a heating and cooling system installed at the physical simulation area, outputting smells to the physical simulation area using a smell output device, and/or outputting vibratory effects within the physical simulation area using a vibration system.
As noted above, the monitoring of a test/simulation may include monitoring physical actions/activities performed by the user using video recording devices and/or motion detectors. Additionally or alternatively, the monitoring may be of computer-based tasks, using additional software-based sensors such as mouse movement trackers, keystroke loggers, user eye-tracking software, etc.
In accordance with certain aspects described herein, the processes used for testing/evaluating a user and determining that a user is eligible for a particular digital credential (or badge) need not include a specific test, designated evaluation, or scored scenario training. Rather, the testing and badging determinations may be performed automatically during the user's normal course of on-the-job performance of tasks. In such embodiments, the testing and credentialing of users may be based on observation of workers during their normal work activities. Cameras and other sensors may be installed and used to detect the completion of tasks and/or certain competencies of the users, and the data from these sensors may be evaluated to automatically determine when the user is eligible for a digital credential. Thus, on-the-job testing and badging may be performed entirely transparently to the worker's performance of their job duties, and need not require any delay or distraction from job performance, or any designated time or location to perform formal testing.
In order to perform automatic and on-the-job testing and credentialing of workers or other users (e.g., students, athletes, etc.), the “work” environment of the user may be monitored with cameras and/or sensors capable of tracking the user's activities and performance. As discussed above with respect to the implementation of physical testing environments (e.g., 700), different types of digital credentials relate to different activities that may be performed in a variety of different work environments. Referring briefly to
Another example work environment is shown in
Referring now to
In step 1001, a computer server controlling the on-the-job badging system may activate the cameras, sensors, monitoring software, etc., within the workstation and/or work environment. As discussed above, this activation may include specific monitoring software to detect computer-based tasks, and/or location monitoring devices such as cameras, sensors, biometrics, etc., depending on the type of workers and work environments 900 being monitored. In some cases, an on-the-job testing and credentialing system may be implemented as an “always on” system, in which the workstation/workplace monitoring is constantly recording and analyzing worker activities. Thus, step 1001 may be optional in such embodiments. However, in other cases, workstation/workplace monitoring might only be activated at certain times and not others, for example, only during normal work hours, only on certain specific work days designated for work evaluation, etc. In some embodiments, a system administrator and/or individual workers may activate or de-activate the workstation/workplace monitoring systems within their work environment at any time. Thus, such systems need not be an invasion of privacy for any worker that does not choose for their work to be monitored and evaluated, but workers may choose to turn the monitoring systems on in order to be eligible for evaluation and earning of additional work-related digital credentials or other credentials.
In step 1002, the workstation/workplace monitoring systems may capture the user's work-related activities and behaviors, including performing various computer-based tasks and non-computer-based tasks as discussed above. In step 1003, the user's working data as collected by the workstation/workplace monitoring systems and sensors may be analyzed by the server, in order to determine in step 1004 whether or not the user is eligible for one or more digital credentials or other credentials (e.g., professional certifications, etc.) based on their on-the-job work activities. Certain digital credentials or other credentials may be made available to users in response to detecting that the user has successfully completed one or more specialized work tasks, thus demonstrating that the user has obtained the particular skill associated with the digital credential. In some cases, the server and/or the monitoring systems and sensors may also be configured to detect a certain level of efficiency by the user in performing the tasks, and/or may require that the user perform a certain task N number of times before the user is eligible for the digital credential or credential.
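The eligibility determination of steps 1003-1004, requiring a task to be completed N times, optionally at or above an efficiency threshold, can be sketched as follows. The record format is an assumption for illustration:

```python
def is_eligible(task_records, required_count, min_efficiency=None):
    """Determine digital credential eligibility from monitored task records.

    task_records: list of dicts such as {"completed": True, "efficiency": 0.9},
    one per monitored performance of the relevant task.
    required_count: the N completions required for the credential.
    min_efficiency: optional minimum efficiency for a completion to count.
    """
    qualifying = [
        r for r in task_records
        if r.get("completed")
        and (min_efficiency is None or r.get("efficiency", 0.0) >= min_efficiency)
    ]
    return len(qualifying) >= required_count
```

In practice the records would be produced by the camera/sensor analysis of step 1003 rather than supplied directly.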
In step 1004, if the system determines that the user is eligible for one or more particular digital credentials (1004:Yes), then in step 1005 the system may issue the digital credential directly (e.g., if the workplace server is permitted to be a digital credential issuer), and/or may initiate a communication session with a badging platform 610 and/or digital credential issuer 630 to request that a new digital credential be issued for the worker. In such examples, the workplace server may provide the information identifying the worker (e.g., name, employee ID, digital credential system profile ID, etc.) to a digital credential platform 610 or issuer 630, along with verification that the worker has completed the requirements to earn a particular digital credential. In some embodiments, the servers operating at the workplace may be configured to capture evidence (e.g., video evidence, screen captures, facial/identity verification, etc.) and transmit the evidence to the digital credential-issuing authority, before the digital credential may be issued.
In step 1006, the worker may be notified that they have received a digital credential based on their normal on-the-job activities. In some embodiments, the worker may indicate interest in obtaining one or more particular digital credentials, and the workstation/workplace monitoring system may be configured to evaluate the worker with respect to the particular digital credentials or other credentials that the worker has expressed interest in. However, in other examples, it may be possible for a worker to receive an issued digital credential without expressing any interest in the digital credential (or even being aware of such a digital credential), but solely based on the determination that the worker has achieved the level of skills mastery required for the digital credential/credential, based on the automated monitoring of the worker within the workplace. In certain cases, a user may be informed that they are eligible for receiving a digital credential prior to the issuance of the digital credential in step 1005, and the user may be allowed to accept or reject the digital credential. Additionally, in some cases, the user may receive status reports (e.g., daily, weekly, etc.) identifying which digital credentials the user is being monitored for, and the user's progress with respect to earning those digital credentials. This data may include indications to the worker that he/she may earn a particular digital credential after performing a task another N times, or performing the task a certain amount faster, or performing the task without making any errors or backtracking, etc.
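The status reports described above could be assembled from the same monitored task counts. A minimal sketch, with a hypothetical credential name and report fields not taken from the source:

```python
def progress_report(credential_name, required_count, completed_count):
    """Summarize a worker's progress toward a monitored digital credential."""
    remaining = max(0, required_count - completed_count)
    return {
        "credential": credential_name,
        "completed": completed_count,
        "remaining": remaining,   # "perform the task another N times"
        "eligible": remaining == 0,
    }
```

Such a report could be generated daily or weekly per monitored credential, as the paragraph above suggests.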
For example, in certain embodiments and implementations of the concepts discussed above in reference to
In such cases, the generated (or issued digital credential) may be embedded with additional data such as the evaluation/simulation time, location, or the sensor system/physical environment within which the evaluation/simulation was performed. Additionally, in some cases, facial recognition data and/or biometric data may be collected from the user (credential receiver), and may be used to validate or authenticate the digital credential by verifying the user's identity. As in the above examples, the monitoring may be done using physical movement tracking sensors such as video recorders and/or motion detectors, or may use software-based sensors such as network monitoring devices, keystroke loggers, mouse movement monitors, touch screen monitors. Such software-based tools may operate as background processes on a computer terminal being monitored, and/or may be built into specific software programs with which the user is interacting.
Additional aspects described herein relate to the automated tracking of user or worker activities, after the user/worker has been issued a badge (or digital credential), in order to determine how often the user/worker is “using” their digital credential. Depending on the type of digital credential or credential, post-credentialing monitoring of the user may involve analysis of the user's physical work product (e.g., documents produced, parts/items created, etc.), or may involve observations of the user (e.g., via a workstation/workplace monitoring system). In order to evaluate how often a user is using a particular digital credential, a data store of digital credentials may be linked to particular skills, work-related tasks, or activities. The user/worker may then be tracked to determine the number of such tasks performed, and/or the quality, efficiency, and/or competence of the user's performance of those tasks, in order to determine to what extent the user/worker is “using” the digital credential.
Referring now to
In this example, system 1100 also includes a credential-to-activity mapping data store 1130, which may be implemented as a separate external data store and/or may be integrated into the digital credential data store of server 1110. The credential-to-activity mapping data store 1130 may include mappings of one or more tasks or activities associated with each digital credential type that a user may potentially earn. For example, a digital credential relating to automotive maintenance for a particular make of car may have associated activities and tasks that include particular maintenance tasks (e.g., tune-ups, part replacements, etc.) for different model cars of that make. As another example, an operating system administrator-related digital credential may list, within data store 1130, various system administrator tasks that a user may perform on the particular operating system. In some cases, the activities or tasks associated with a particular digital credential may correspond to the same set of activities or tasks that a user is required to perform to earn the particular digital credential, and as discussed below, these activities or tasks may serve as a metric to evaluate how much the user is “using” the digital credential.
One or more workstation and/or workplace monitoring systems 1120 may provide user monitoring data to the server 1110, to allow the server 1110 to analyze the user's activities and determine to what extent the user is using the activities and abilities associated with their digital credentials. In some embodiments, the workstation and/or workplace monitoring systems 1120 may be similar or identical to any of the workstation/workplace monitoring systems and sensors discussed above. For example, workplace monitoring systems 1120 may collect records detailing the user's physical work product (e.g., documents produced, modified, or accessed by the user, inventory or work order records indicating tasks performed by the user, etc.). Additionally, workplace monitoring systems 1120 may include observation systems including cameras and other sensors to track the user's activities and determine which specific tasks have been performed by the user.
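The credential-to-activity mapping of data store 1130 can be sketched as a dictionary from credential identifiers to associated task labels. The credential IDs and task names below are hypothetical examples loosely based on the automotive and system-administrator examples above:

```python
# Illustrative content of the credential-to-activity mapping (data store 1130).
CREDENTIAL_TO_ACTIVITIES = {
    "auto-maintenance-make-x": {"tune-up", "part-replacement"},
    "os-admin": {"create-user-account", "configure-backup", "patch-system"},
}

def credential_usage_count(credential_id, observed_tasks):
    """Count observed tasks (from monitoring systems 1120) that map to the
    given credential, as a raw measure of how much it is being "used"."""
    mapped = CREDENTIAL_TO_ACTIVITIES.get(credential_id, set())
    return sum(1 for task in observed_tasks if task in mapped)
```

The observed task list would be produced by the monitoring systems' record and sensor analysis rather than entered manually.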
In some embodiments, the monitoring and tracking of post-credentialing activities by the user may be used to analyze and provide digital credential or credential feedback data to various entities. For example, referring now to
In step 1202, the digital credential server 1110 and/or monitoring systems 1120 may monitor and track the activities of the credentialed user, including, for example, the workplace tasks performed by the user based on analyses of the various monitoring systems/sensor data installed at the user's workstation and/or workplace environment. As described above, determining what activities and tasks the credentialed user has performed, and when, may be performed using a variety of techniques. In some cases, determining what work-related tasks a user has performed, and what other activities they have been engaged in, may be done by analyses of written and electronic documents associated with the user or workplace. For instance, documents such as maintenance requests, work orders, customer tickets, purchase receipts, and the like may be analyzed to determine what activities or tasks the user has completed and when. As an example, a maintenance record listing the user as the assigned technician may be used to determine that the user has performed the specified task/activity at the time listed on the record. In other examples, the user's electronic mail and other electronic documents may be searched and analyzed (e.g., using a keyword analysis and/or trained artificial intelligence) to determine what tasks the user has performed and/or what activities the user has demonstrated during the relevant time periods.
In some embodiments, there may be particular advantages in implementing a post-credentialing usage analysis and/or digital credential valuation process for certain digital credentials/tasks that are more discrete and detectable, for instance, a number of transmissions changed after earning a vehicle transmission certification, a number of particular medical procedures performed following a digital credential for the procedure, a number of IT tickets resolved successfully following receiving an advanced IT computer services and computer repair digital credential, etc. In contrast, for other tasks and activities for which a user may receive a digital credential, such as leadership, communication skills, advanced C software programming, jujitsu skill levels, and the like, it may be more difficult to quantify if, when, and how often a user is using the particular skill or task associated with the digital credential.
In step 1203, a set of tasks and/or activities associated with the digital credentials obtained by the specific user may be retrieved using the credential-activity mapping data store 1130, and in step 1204 the retrieved tasks and/or activities may be compared to the tasks and activities that have been performed by the user subsequent to the digital credentials being earned (as determined in step 1202). As an example, the comparison in step 1204 may determine that in the six months since the user was issued a professional certification to perform a particular technical task, the user has performed that task on a weekly basis. Alternatively, for a different digital credential issued to the user directed to expertise in a particular software program, the comparison in step 1204 may determine that the user has used that software program only once since receiving the digital credential two years ago. In this case, the system may conclude that the professional certification issued six months ago to the user has been of greater usefulness than the software digital credential issued two years ago (allowing for the possibility of career changes, prestige-driven digital credentials rather than functional digital credentials, etc.).
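The step 1204 comparison, e.g., "performed weekly for six months" versus "used once in two years", amounts to computing a usage rate since issuance. A minimal sketch, normalizing to uses per 30-day period (the normalization choice is an assumption):

```python
from datetime import datetime

def usage_rate(task_timestamps, issued_at, now):
    """Average uses per 30-day period since the digital credential was issued.

    task_timestamps: datetimes at which a credential-associated task was
    performed (as determined in step 1202).
    """
    months = max((now - issued_at).days / 30.0, 1.0)  # avoid division by ~0
    used = sum(1 for t in task_timestamps if t >= issued_at)
    return used / months
```

Comparing these rates across a user's credentials supports conclusions like the professional-certification-versus-software-credential example above.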
In step 1205, data from the comparison of step 1204, i.e., data indicating the post-credentialing usage by the user of the digital credential-associated activities or tasks, may be aggregated and analyzed, and then transmitted to one or more of the relevant system components. In various embodiments, any of several different components and roles associated with the credentialing platform 1110 may request and receive this information for their associated digital credentials and/or associated users. For instance, digital credential owners and/or digital credential issuers may request and receive from the platform server 1110 data regarding the post-issuance usage of the digital credentials they own or have issued. In other cases, digital credential endorsers may request and receive from the platform server 1110 data regarding the post-issuance usage of the digital credentials they have endorsed. Digital credential earners (i.e., the users themselves) also may request reports from the platform server 1110 quantifying the post-credentialing usage (which may be expressed in terms of time, value, and/or dollar amounts) associated with their previously earned digital credentials. Employers and other organizations also may request such reports for their employees or organization members, in order to determine which digital credentials have been the most used and most useful to the organization.
Referring now to
Steps 1301-1304 may correspond to steps 1201-1204 in some cases, and may be performed using similar or identical techniques to those discussed above. For example, in step 1301 a platform server 1110 and/or digital credential issuer may issue a digital credential associated with one or more activities or tasks to a particular user, recording the digital credential issuance data within the digital credential data store. In step 1302, the post-issuance activities of the particular user may be monitored, including monitoring of the user's work-related activities and tasks performed/completed, in order to determine the particular tasks and activities with which the user has been engaged following issuance of the digital credential. In step 1303, the skills, activities, and tasks associated with the user's digital credential(s) are retrieved, and in step 1304 are compared to the post-issuance user tasks and activities determined for the user in step 1302. Finally, in step 1305, based on the comparison in step 1304, the platform server 1110 may determine that an expiration date and/or recertification date associated with the user's digital credential should be adjusted based on the user's post-issuance activities. As an example, if the system determines in step 1305 that a user received a digital credential corresponding to a forklift operator's license or commercial truck driving license three years ago, but has infrequently (or not at all) driven a forklift or a commercial truck since receiving the digital credential, then the system may determine that the user's license should expire at the earliest possible time (e.g., the expiration time as of when the digital credential was first issued).
In contrast, if the system determines in step 1305 that the same user has frequently and consistently driven a forklift or a commercial truck ever since receiving their digital credential, and also that the user has a high safety rating and/or high safety compliance scores, then the system may determine that the user's license may be extended. In such cases, the platform server 1110 may determine a new extended expiration or recertification time for the digital credential, update the user's digital credential record within the digital credential data store, and transmit notifications to the affected entities (e.g., the user, employer, digital credential issuer, digital credential owner, etc.) providing the new expiration date. In other examples, rather than changing the expiration date or recertification date of a digital credential (or eliminating the expiration altogether), the platform server 1110 may determine a new recertification course or procedure for the user, such as a simple refresher course that allows the user to recertify more quickly than the longer complete recertification course used by other users with less post-credentialing digital credential usage.
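The expiration-adjustment decision of step 1305 can be illustrated, purely as a sketch, by a rule that extends the expiry for frequent and safe usage and otherwise leaves the original expiry in place. The thresholds and parameter names here are hypothetical, not drawn from the source:

```python
def adjust_expiration(base_expiry_years, uses_per_year, safety_score,
                      min_uses=12, extension_years=2):
    """Step 1305 sketch: extend a credential's expiry for frequent, safe
    post-issuance usage; keep the original expiry for infrequent usage.
    All thresholds are illustrative."""
    if uses_per_year >= min_uses and safety_score >= 0.9:
        return base_expiry_years + extension_years
    return base_expiry_years

# A consistently active, safe driver earns an extension; an inactive
# license holder keeps the originally issued expiration.
extended = adjust_expiration(3, uses_per_year=52, safety_score=0.95)
unchanged = adjust_expiration(3, uses_per_year=1, safety_score=0.95)
```

In practice the platform server 1110 would then update the credential record and notify the affected entities, as described above.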
Additional aspects described herein relate to capturing and using “evidence” data in connection with user testing and credentialing systems, on-the-job evaluation and badging systems, and/or post-credential monitoring systems. For example, within any automated badging/certification/verification system, evidence of the user's performance may be extracted and saved, for example, in a digital credential server along with an associated issued digital credential, or as part of a separate user portfolio of evidence. Evidence data may include, for example, audio and video of the user during a live simulation, or during a virtual reality or augmented reality simulation, audio and keystroke data from the user during the testing process, the user's reaction time and/or decision-making data during a split-second simulated scenario or relevant real-life event (e.g., a workplace accident, etc.), and/or any other sensor or biometric data collected during testing, credentialing, and/or monitoring. As discussed below, evidence data associated with a user may be saved with the user's digital credential and/or into a separate portfolio of evidence, which may be available to the user for review, and also may be provided upon request to potential employers for review during a review or hiring process. Such evidence data also may be applied to updated digital credential requirements, so that in some cases a user may simply resubmit their evidence portfolio instead of being required to recertify their digital credential when the test or credentialing standards are updated.
Referring now to
In this example, the platform server 1410 may receive data from three testing/credentialing systems 1421-1423. Similar to the above examples, the simulation lab system 1421 may correspond to a simulation lab or other physical testing environment; the on-the-job credentialing systems 1422 may include workstation/workplace monitoring systems and sensors to record and analyze the user's on-the-job performance, and may issue digital credentials in some cases without the need for any separate formal testing procedure; and the post-credential monitoring systems 1423 may be configured to monitor users following the issuance of a digital credential, including tracking task performance data, skills usage, and the like, and comparing the data to the skills/tasks associated with the user's digital credentials.
In some embodiments, one or more systems 1421-1423 which perform user testing, credentialing, and/or monitoring, such as those systems discussed above, may capture and transmit “evidence data” of the user during a test, simulation, or during an on-the-job monitoring process. Evidence data may include, for example, video and/or audio of the user during a test or simulation (e.g., live, VR, or AR), collected by the sensors of a physical testing environment 700. Additional evidence data may include user reaction time data, decision-making data, facial expression and body language data, keystroke and mouse movement data, and/or user biometric data. The evidence data may correspond to a time period just before, during, and just after a test, simulation, or a task or activity performed during on-the-job monitoring.
As shown in this example, the various evidence data collected by systems 1421-1423 may be transmitted to the platform server 1410 and stored in an evidence portfolio data store. The evidence data collected by the testing, credentialing, and/or user monitoring systems may be associated with a particular user (or users) and with a particular digital credential (or digital credentials) that the user is in the process of earning or using (e.g., for post-credentialing monitoring). Thus, the evidence data may provide documented proof that the user actually completed the digital credential requirements, along with additional contextual evidence showing how the user performed during the testing, simulation, or monitoring.
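As a sketch only (the actual evidence portfolio data store is not specified in the source), associating each evidence record with both a user and a digital credential could be modeled with a keyed in-memory store. All identifiers below are hypothetical:

```python
# Hypothetical in-memory stand-in for the evidence portfolio data store
# maintained by the platform server 1410.
evidence_store = {}

def store_evidence(user_id, credential_id, evidence):
    """Associate a captured evidence record (video, keystrokes, biometric
    readings, ...) with both a user and a particular digital credential."""
    evidence_store.setdefault((user_id, credential_id), []).append(evidence)

# Two evidence records captured by different systems (e.g., 1421 and 1422)
# for the same user and credential.
store_evidence("user-1", "cred-A", {"type": "video", "uri": "sim-lab-001"})
store_evidence("user-1", "cred-A", {"type": "keystrokes", "uri": "term-002"})
```

Keying on the (user, credential) pair reflects the association described above: the stored evidence documents that the user actually completed that credential's requirements.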
Referring now to
In some cases, the platform server 1410 may determine a subset of the user activities matching digital credential requirements associated with the digital credential, wherein other user activities might bear no relevance to the requirements of the digital credential. In such cases, the platform server may store only the corresponding subset of the evidence/sensor data for the user activities matching digital credential requirements, and might not store other evidence/sensor data corresponding to irrelevant user activities upon which digital credentials do not depend.
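The subset selection described above amounts to filtering captured activity records against the credential's requirement set. Below is a minimal, hypothetical sketch of that filter; the activity names and record fields are illustrative only:

```python
def relevant_evidence(activities, requirements):
    """Keep only evidence/sensor records for user activities that match a
    digital credential requirement; drop evidence for irrelevant activities."""
    return [a for a in activities if a["activity"] in requirements]

# Only the welding clip matters to a hypothetical welding credential;
# the unrelated activity's sensor data is not retained.
activities = [
    {"activity": "weld_joint", "sensor_data": "clip-1"},
    {"activity": "coffee_break", "sensor_data": "clip-2"},
]
kept = relevant_evidence(activities, {"weld_joint"})
```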
Referring now to
In some examples, the request in step 1601 may be from the user himself/herself, who wants to review and study the evidence from his/her previous tests, simulations, and monitoring data. In other examples, the request in step 1601 may be from a current or potential employer, who has been authorized by the user to retrieve and view the user's evidence data associated with all work-relevant digital credentials, as part of a hiring process or review process. The user's evidence data may verify to the employer or potential employer that the user actually completed the digital credential requirements, and also may allow the employer or potential employer to observe the user's behaviors, responses, and reactions first-hand, thus allowing them to evaluate the user's reaction time, efficiency, mental state, decision-making, and other difficult-to-quantify characteristics. In still other examples, the user may authorize a digital credential issuer or digital credential owner to view the user's evidence files related to the digital credentials issued and owned by those entities. Finally, users may make some or all of their evidence data publicly available (e.g., on a file-by-file basis) and/or may actively post their evidence data as a multimedia file or data records within a digital credential profile page of the user that is maintained and published by the platform server 1410.
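The access rules described for step 1601 (owner access, user-granted authorization, and optional public availability) can be captured, as a hypothetical sketch, in a single predicate. The record fields and party names below are assumptions, not from the source:

```python
def may_view(requester, owner, record):
    """Step 1601 access sketch: the evidence owner may always view; other
    parties may view only if the record is public or the owner has
    authorized them (e.g., an employer, issuer, or credential owner)."""
    if requester == owner or record.get("public"):
        return True
    return requester in record.get("authorized", set())

# The user authorized a (hypothetical) employer's HR system, but not others.
record = {"authorized": {"acme-hr"}, "public": False}
```

A real deployment would enforce such a check on the platform server 1410 before releasing any evidence files.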
In some embodiments, in addition to (or instead of) providing evidence data in response to requests, the platform server 1410 may provide the functionality to receive updated tests, digital credential requirements, credentialing data, etc., and to apply a user's previously stored evidence to the new testing or credentialing requirements. For instance, in step 1604 of
In various embodiments, the updated testing/credentialing process in step 1604 may correspond to a re-issuance of a digital credential, with the same or updated requirements, or may correspond to a different digital credential having similar and/or overlapping issuance requirements. For instance, the platform server 1410 may receive an updated set of requirements for a digital credential previously issued to a credential receiver, may retrieve the stored set of sensor data corresponding to the relevant activities performed by the credential receiver in connection with the issuance, may compare the retrieved set of sensor data/activities to the updated set of digital credential requirements, and then may generate/issue an updated digital credential to the credential receiver, based on the comparison of the retrieved set of sensor/activity data to the updated set of digital credential requirements. Similar techniques may be performed to generate and issue digital credentials to receivers for entirely different digital credentials, rather than updated credentials, such as similar credentials and/or credentials having overlapping eligibility requirements.
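The re-issuance comparison described above reduces to checking whether the stored evidence covers every updated requirement. The following is an illustrative sketch under that assumption, with hypothetical activity names:

```python
def reissue_from_evidence(stored_activities, updated_requirements):
    """Compare previously stored activity evidence to an updated set of
    credential requirements; the updated credential may issue only if every
    requirement is covered, otherwise the missing requirements are reported."""
    covered = {a["activity"] for a in stored_activities}
    missing = updated_requirements - covered
    return (len(missing) == 0, missing)

# Stored evidence covers the original requirements, so re-issuance with the
# same requirements succeeds; an added requirement ("aed") would need new
# testing or evidence.
stored = [{"activity": "cpr"}, {"activity": "first_aid"}]
ok, missing = reissue_from_evidence(stored, {"cpr", "first_aid"})
ok2, missing2 = reissue_from_evidence(stored, {"cpr", "first_aid", "aed"})
```

This is why a user may simply resubmit an evidence portfolio when standards change: only uncovered requirements trigger recertification.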
Additional aspects described herein relate to capturing and using user biometric data, physical user cues, and the like, in connection with user testing and credentialing systems, on-the-job evaluation and badging systems, and/or post-credential monitoring systems. For example, within any automated badging/certification/verification system, data identifying particular physical user cues and/or user biometric data may be collected during testing/simulation/monitoring processes and saved, for example, in a digital credential platform server along with an associated issued digital credential and/or the associated user. Physical user cues may include, for example, facial expressions, user reactions and/or noises made by the user during testing/simulations, user body language, eye movement, and any other user behavior or reaction detectable via cameras and external sensors. Additionally or alternatively, various types of user biometric data also may be collected during the testing, simulation, and/or monitoring processes performed on the user. Such biometric data may include, for instance, the user's temperature, heartrate, blood pressure, respiration, skin conductivity, and brainwave activity, and/or any known types of biometric data that may be collected during testing, credentialing, and/or monitoring processes.
As discussed in more detail below, the user's physical cues and/or biometric data may be collected and saved within a digital credential platform server, and associated with the user, one or more particular digital credentials, and/or with the particular testing/simulation/monitoring processes during which the data was originally detected. Once collected, the data may be used to authenticate the testing, simulation, and/or monitoring processes, to confirm the user's identity and to prevent errors or fraudulent activities by users. The data may be saved with the user's digital credential and/or into a separate portfolio of evidence, which may be available to the user for review, and also may be provided upon request to potential employers for review during a review or hiring process. Such evidence data also may be applied to updated credentialing requirements, so that in some cases a user may simply resubmit their evidence portfolio instead of being required to recertify their digital credential when the test or credentialing standards are updated. In certain embodiments, the user's physical cues and/or biometric data also may be analyzed to determine the user's emotional states and reactions during the testing, simulation, and/or monitoring. Additionally or alternatively, the physical cues and biometric data may be detected for several users and analyzed collectively to provide feedback regarding the digital credential testing processes, simulations, monitoring, physical testing environments, etc.
Referring now to
In other embodiments, such as for certain on-the-job credentialing or monitoring systems, or for formal testing/credentialing when sophisticated high-tech physical testing environments are not used, the physical cue data and/or biometrics data collected may be limited by the cameras and sensors available. In some cases, a laptop camera or webcam installed at the user's workstation may be used to capture facial images and/or to recognize facial expressions at different times during the testing/monitoring. However, such cameras may or may not have the resolution and image capture capabilities to perform advanced facial expression monitoring, eye movement tracking, and/or body language detection. In other examples, such as on-the-job credentialing and monitoring scenarios, facial images might only be detectable using lower-quality security cameras or the like that are configured to monitor an entire floor or workspace. In such examples, the facial images may still be useful for certain purposes (e.g., confirmation of user identification), but potentially may be unsuitable for facial expression analysis, eye movement analysis, and the like.
Additionally or alternatively, physical testing environments (e.g., simulation labs) and/or workstation or workplace monitoring systems may include various biometric sensors configured to detect biometric data of the user at different times during the test/simulation. As noted above, such biometric data may include the user's temperature, heartrate, blood pressure, respiration, skin conductivity, and brainwave activity, and/or any known types of biometric data. Thus, the biometric data may be detected and captured via a combination of external sensors, wearable sensors, and/or implanted sensors in some cases. For on-the-job credentialing and monitoring, mobile wearable sensors such as heartrate monitors, step trackers, and the like, may be used when more advanced wearable sensors (e.g., blood pressure, respiration, skin conductivity, brainwave activity, etc.) are not practical.
Referring now to
In some embodiments, the platform server 1410 may use the physical cues and/or biometrics data collected for the user as part of an authentication process in step 1804. For example, during any testing/credentialing process (e.g., written testing, computer-based testing, simulation lab testing, etc.) the user's facial images, physical cues, and/or biometrics may be compared against previously stored corresponding data (e.g., user images, physical cue patterns, biometrics, etc.) in order to verify that the correct user is taking the test/simulation. Additionally, the user's physical cues and biometrics may provide an additional level of authentication, by comparing the observed physical cues and biometrics at particular times during the test or simulation to expected physical cues and biometrics, based on what is happening during the test or simulation at that particular time. For instance, a simulation may be designed to present a challenging and stressful situation to the user at a particular timestamp or within a sequence of tasks the user is performing. In step 1804, the server may compare the user's observed physical cues and biometrics to the physical cues and biometrics that would be expected for the challenging and stressful situation, in order to confirm that the data is valid and/or that the user did not expect this situation in advance (e.g., indicating cheating). In step 1805, the platform server 1410, having validated the user's identity and the authenticity of the user's physical cues and biometrics, may store the testing, credentialing, and monitoring data in the digital credential data store as valid data. In some embodiments, the image data, facial cues, and/or biometrics data also may be retained and stored by the platform server for future analysis.
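The second-level authentication in step 1804 can be sketched as a tolerance check of observed biometrics against the readings expected at a given point in the simulation. The metric names, expected values, and tolerance below are illustrative assumptions only:

```python
def authenticate_biometrics(observed, expected, tolerance=0.15):
    """Step 1804 sketch: compare observed biometrics at a simulation
    timestamp to the expected readings for that moment. A calm reading
    during a scenario designed to be stressful fails the check, which may
    indicate invalid data or foreknowledge of the scenario."""
    for key, expected_value in expected.items():
        if abs(observed[key] - expected_value) > tolerance * expected_value:
            return False
    return True

# Hypothetical expected readings at the timestamp of a stressful event.
expected_stress = {"heartrate": 110, "respiration": 22}
```

A passing check would allow the data to be stored as valid in step 1805; a failing check would flag the session for review.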
In some embodiments, the data relating to the user's physical cues and biometrics collected during a test, simulation, or during on-the-job monitoring, may be further evaluated to identify the user's emotional states at different times. For instance, certain simulations may be specifically designed to invoke certain emotional states (e.g., anger, boredom, frustration, surprise, etc.), and the user's level of performance while experiencing those emotional states may be particularly important for certain testing/credentialing processes. In these examples and other cases, either exhibiting or not exhibiting particular emotional states may be an eligibility requirement for a credential receiver to obtain certain types of digital credentials. Thus, the data collected during the test, simulation, or monitoring in step 1801 may be used not only for user identification/authentication, but also may be analyzed to (1) determine the user's emotional state at different times during the test, simulation, or monitoring, (2) compare that emotional state to an expected emotional state based on what the user is experiencing, and (3) evaluate the user's reactions and levels of skill performance during different emotional states.
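Steps (1)-(3) above can be illustrated by inferring a coarse emotional state from biometric samples and then grouping per-sample performance scores by that state. The classifier thresholds and field names are hypothetical, chosen only to make the sketch concrete:

```python
def classify_emotional_state(heartrate, skin_conductivity):
    """Toy classifier mapping biometric readings to a coarse emotional
    state; the thresholds are illustrative, not from the source."""
    if heartrate > 100 and skin_conductivity > 0.6:
        return "stressed"
    if heartrate < 70 and skin_conductivity < 0.3:
        return "calm"
    return "neutral"

def performance_under_state(samples):
    """Steps (1)-(3) sketch: infer the emotional state at each moment of
    the test/simulation, then average performance scores per state."""
    by_state = {}
    for s in samples:
        state = classify_emotional_state(s["heartrate"], s["skin_conductivity"])
        by_state.setdefault(state, []).append(s["score"])
    return {state: sum(v) / len(v) for state, v in by_state.items()}

# One sample taken during an intentionally stressful scenario, one during
# a routine task.
samples = [
    {"heartrate": 120, "skin_conductivity": 0.8, "score": 0.7},
    {"heartrate": 60, "skin_conductivity": 0.2, "score": 0.95},
]
summary = performance_under_state(samples)
```

A summary of this form would let a credentialing process require, for example, a minimum performance level specifically while the user is in a stressed state.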
Additionally, in some embodiments, the physical cues, biometrics data, and/or emotional states detected for multiple users may be aggregated for the same tests, simulations, monitoring environments, etc. The aggregated data for tests may be used to revise current tests and simulations, design new tests and simulations, and for training users how to respond to particular scenarios and situations (e.g., workplace accidents).
A number of variations and modifications of the disclosed embodiments can also be used. Specific details are given in the above description to provide a thorough understanding of the embodiments. However, it is understood that the embodiments may be practiced without these specific details. For example, well-known circuits, processes, algorithms, structures, and techniques may be shown without unnecessary detail in order to avoid obscuring the embodiments.
Implementation of the techniques, blocks, steps and means described above may be done in various ways. For example, these techniques, blocks, steps and means may be implemented in hardware, software, or a combination thereof. For a hardware implementation, the processing units may be implemented within one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, other electronic units designed to perform the functions described above, and/or a combination thereof.
Also, it is noted that the embodiments may be described as a process which is depicted as a flowchart, a flow diagram, a swim diagram, a data flow diagram, a structure diagram, or a block diagram. Although a depiction may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be re-arranged. A process is terminated when its operations are completed, but could have additional steps not included in the figure. A process may correspond to a method, a function, a procedure, a subroutine, a subprogram, etc. When a process corresponds to a function, its termination corresponds to a return of the function to the calling function or the main function.
Furthermore, embodiments may be implemented by hardware, software, scripting languages, firmware, middleware, microcode, hardware description languages, and/or any combination thereof. When implemented in software, firmware, middleware, scripting language, and/or microcode, the program code or code segments to perform the necessary tasks may be stored in a machine readable medium such as a storage medium. A code segment or machine-executable instruction may represent a procedure, a function, a subprogram, a program, a routine, a subroutine, a module, a software package, a script, a class, or any combination of instructions, data structures, and/or program statements. A code segment may be coupled to another code segment or a hardware circuit by passing and/or receiving information, data, arguments, parameters, and/or memory contents. Information, arguments, parameters, data, etc. may be passed, forwarded, or transmitted via any suitable means including memory sharing, message passing, token passing, network transmission, etc.
For a firmware and/or software implementation, the methodologies may be implemented with modules (e.g., procedures, functions, and so on) that perform the functions described herein. Any machine-readable medium tangibly embodying instructions may be used in implementing the methodologies described herein. For example, software codes may be stored in a memory. Memory may be implemented within the processor or external to the processor. As used herein the term “memory” refers to any type of long term, short term, volatile, nonvolatile, or other storage medium and is not to be limited to any particular type of memory or number of memories, or type of media upon which memory is stored.
Moreover, as disclosed herein, the term “storage medium” may represent one or more memories for storing data, including read only memory (ROM), random access memory (RAM), magnetic RAM, core memory, magnetic disk storage mediums, optical storage mediums, flash memory devices and/or other machine readable mediums for storing information. The term “machine-readable medium” includes, but is not limited to, portable or fixed storage devices, optical storage devices, and/or various other storage mediums capable of storing, containing, or carrying instruction(s) and/or data.
While the principles of the disclosure have been described above in connection with specific apparatuses and methods, it is to be clearly understood that this description is made only by way of example and not as limitation on the scope of the disclosure.
This application is a non-provisional of and claims priority to U.S. Provisional Patent Application No. 62/559,433, entitled “DIGITAL CREDENTIAL PLATFORM,” filed Sep. 15, 2017, the entire contents of which are incorporated by reference herein for all purposes.
20100332008 | Knipfer et al. | Dec 2010 | A1 |
20110022496 | Johnson et al. | Jan 2011 | A1 |
20110066490 | Bassin et al. | Mar 2011 | A1 |
20110165542 | Campbell et al. | Jul 2011 | A1 |
20110244440 | Saxon et al. | Oct 2011 | A1 |
20110279228 | Kumar | Nov 2011 | A1 |
20110281246 | Brunacini et al. | Nov 2011 | A1 |
20110320423 | Gemmell et al. | Dec 2011 | A1 |
20120034584 | Logan et al. | Feb 2012 | A1 |
20120059917 | Dawson et al. | Mar 2012 | A1 |
20120069131 | Abelow | Mar 2012 | A1 |
20120122062 | Yang et al. | May 2012 | A1 |
20120146789 | De Luca et al. | Jun 2012 | A1 |
20120237913 | Savitsky et al. | Sep 2012 | A1 |
20130011819 | Horseman | Jan 2013 | A1 |
20130012786 | Horseman | Jan 2013 | A1 |
20130012788 | Horseman | Jan 2013 | A1 |
20130012790 | Horseman | Jan 2013 | A1 |
20130012802 | Horseman | Jan 2013 | A1 |
20130013327 | Horseman | Jan 2013 | A1 |
20130063432 | Kaps et al. | Mar 2013 | A1 |
20130086484 | Antin et al. | Apr 2013 | A1 |
20130097093 | Kolber et al. | Apr 2013 | A1 |
20130117400 | An | May 2013 | A1 |
20130128022 | Bose et al. | May 2013 | A1 |
20130137066 | Pollak et al. | May 2013 | A1 |
20130189656 | Zboray et al. | Jul 2013 | A1 |
20130203509 | Reed et al. | Aug 2013 | A1 |
20130210406 | Vidal et al. | Aug 2013 | A1 |
20130251214 | Chung | Sep 2013 | A1 |
20130281079 | Vidal et al. | Oct 2013 | A1 |
20130311244 | Abotchie | Nov 2013 | A1 |
20130317791 | Danielson | Nov 2013 | A1 |
20130340058 | Barnes et al. | Dec 2013 | A1 |
20140006615 | Karnik et al. | Jan 2014 | A1 |
20140039956 | Cicio, Jr. | Feb 2014 | A1 |
20140045589 | Paradise et al. | Feb 2014 | A1 |
20140101264 | Dewaele et al. | Apr 2014 | A1 |
20140122355 | Hardtke et al. | May 2014 | A1 |
20140129467 | Vianello | May 2014 | A1 |
20140162224 | Wallace et al. | Jun 2014 | A1 |
20140163333 | Horseman | Jun 2014 | A1 |
20140173748 | Esmailzdeh | Jun 2014 | A1 |
20140195312 | Ansel et al. | Jul 2014 | A1 |
20140201345 | Abuelsaad et al. | Jul 2014 | A1 |
20140205990 | Wellman et al. | Jul 2014 | A1 |
20140207534 | Chakra | Jul 2014 | A1 |
20140240507 | Hsu et al. | Aug 2014 | A1 |
20140278821 | McConnell | Sep 2014 | A1 |
20140279587 | Gafford | Sep 2014 | A1 |
20140282868 | Sheller et al. | Sep 2014 | A1 |
20140304181 | Kurien et al. | Oct 2014 | A1 |
20140304787 | Kurien et al. | Oct 2014 | A1 |
20140309849 | Ricci | Oct 2014 | A1 |
20140315164 | Jones et al. | Oct 2014 | A1 |
20140330412 | Bjarnason | Nov 2014 | A1 |
20140348396 | Laaser et al. | Nov 2014 | A1 |
20140349255 | Watt et al. | Nov 2014 | A1 |
20140353369 | Malin et al. | Dec 2014 | A1 |
20140369602 | Meier et al. | Dec 2014 | A1 |
20140376876 | Bentley et al. | Dec 2014 | A1 |
20150037781 | Breed et al. | Feb 2015 | A1 |
20150050623 | Falash et al. | Feb 2015 | A1 |
20150052075 | Jayadevan et al. | Feb 2015 | A1 |
20150056582 | Selvaraj | Feb 2015 | A1 |
20150059003 | Bouse | Feb 2015 | A1 |
20150066612 | Karpoff et al. | Mar 2015 | A1 |
20150066792 | Sprague | Mar 2015 | A1 |
20150067811 | Agnew et al. | Mar 2015 | A1 |
20150079545 | Kurtz | Mar 2015 | A1 |
20150095999 | Toth | Apr 2015 | A1 |
20150104757 | Moncrief et al. | Apr 2015 | A1 |
20150127565 | Chevalier et al. | May 2015 | A1 |
20150164409 | Benson et al. | Jun 2015 | A1 |
20150187224 | Moncrief | Jul 2015 | A1 |
20150196804 | Koduri et al. | Jul 2015 | A1 |
20150196805 | Koduri et al. | Jul 2015 | A1 |
20150200935 | Ikeda et al. | Jul 2015 | A1 |
20150229623 | Grigg et al. | Aug 2015 | A1 |
20150242797 | Hoanca et al. | Aug 2015 | A1 |
20150242979 | Abts | Aug 2015 | A1 |
20150249661 | Cauthen | Sep 2015 | A1 |
20150302769 | Johnson | Oct 2015 | A1 |
20150318993 | Hamlin et al. | Nov 2015 | A1 |
20150364017 | Hall et al. | Dec 2015 | A1 |
20150375104 | Nishar et al. | Dec 2015 | A1 |
20160004862 | Almehmadi et al. | Jan 2016 | A1 |
20160059136 | Ferris | Mar 2016 | A1 |
20160063314 | Sarnet | Mar 2016 | A1 |
20160161468 | Keays et al. | Jun 2016 | A1 |
20160163217 | Harkness | Jun 2016 | A1 |
20160180248 | Regan | Jun 2016 | A1 |
20160188765 | Vossler et al. | Jun 2016 | A1 |
20160203732 | Wallace et al. | Jul 2016 | A1 |
20160248598 | Lin et al. | Aug 2016 | A1 |
20160248759 | Tsurumi et al. | Aug 2016 | A1 |
20160253486 | Sarkar | Sep 2016 | A1 |
20160253710 | Publicover et al. | Sep 2016 | A1 |
20160267292 | Johnson et al. | Sep 2016 | A1 |
20160322078 | Bose et al. | Nov 2016 | A1 |
20160323173 | Bivens et al. | Nov 2016 | A1 |
20160356751 | Blackley | Dec 2016 | A1 |
20160360791 | Blackley | Dec 2016 | A1 |
20160361452 | Blackley | Dec 2016 | A1 |
20160361677 | Blackley | Dec 2016 | A1 |
20160361678 | Blackley | Dec 2016 | A1 |
20160361972 | Blackley | Dec 2016 | A1 |
20160363332 | Blackley | Dec 2016 | A1 |
20160363339 | Blackley | Dec 2016 | A1 |
20160363567 | Blackley | Dec 2016 | A1 |
20160363570 | Blackley | Dec 2016 | A1 |
20160363572 | Blackley | Dec 2016 | A1 |
20160363582 | Blackley | Dec 2016 | A1 |
20160363917 | Blackley | Dec 2016 | A1 |
20160367925 | Blackley | Dec 2016 | A1 |
20160367926 | Blackley | Dec 2016 | A1 |
20160367927 | Blackley | Dec 2016 | A1 |
20160370335 | Blackley | Dec 2016 | A1 |
20160370337 | Blackley | Dec 2016 | A1 |
20170005868 | Scheines et al. | Jan 2017 | A1 |
20170020195 | Cameron | Jan 2017 | A1 |
20170031449 | Karsten et al. | Feb 2017 | A1 |
20170032248 | Dotan-Cohan et al. | Feb 2017 | A1 |
20170054702 | Turgeman | Feb 2017 | A1 |
20170132464 | Brown | May 2017 | A1 |
20170139762 | Sherlock et al. | May 2017 | A1 |
20170147801 | Hamlin et al. | May 2017 | A1 |
20170148340 | Popa-Simil et al. | May 2017 | A1 |
20170154307 | Maurya et al. | Jun 2017 | A1 |
20170154310 | Duerr et al. | Jun 2017 | A1 |
20170154539 | King et al. | Jun 2017 | A1 |
20170176127 | Ferris | Jun 2017 | A1 |
20170193839 | Breed | Jul 2017 | A1 |
20170193845 | Cardonha et al. | Jul 2017 | A1 |
20170206064 | Breazeal et al. | Jul 2017 | A1 |
20170206567 | Sutton-Shearer | Jul 2017 | A1 |
20170263142 | Zereshkian et al. | Sep 2017 | A1 |
20170272427 | Robison et al. | Sep 2017 | A1 |
20170278417 | Ur et al. | Sep 2017 | A1 |
20170279614 | Mercury et al. | Sep 2017 | A1 |
20170289168 | Bar | Oct 2017 | A1 |
20170323244 | Rani et al. | Nov 2017 | A1 |
20170344927 | Coletta et al. | Nov 2017 | A1 |
20170357928 | Ross et al. | Dec 2017 | A1 |
20170361213 | Goslin et al. | Dec 2017 | A1 |
20170372249 | Abraham et al. | Dec 2017 | A1 |
20180040256 | Alvarez et al. | Feb 2018 | A1 |
20180075229 | Jan | Mar 2018 | A1 |
20180083986 | Hurley | Mar 2018 | A1 |
20180095613 | Ready | Apr 2018 | A1 |
20180096306 | Wang et al. | Apr 2018 | A1 |
20180101806 | Adepoju | Apr 2018 | A1 |
20180129790 | Nama et al. | May 2018 | A1 |
20180143757 | Champion et al. | May 2018 | A1 |
20180144108 | Sawai et al. | May 2018 | A1 |
20180144541 | Champion et al. | May 2018 | A1 |
20180173871 | Toth | Jun 2018 | A1 |
20180197078 | Khan | Jul 2018 | A1 |
20180203238 | Smith, Jr. et al. | Jul 2018 | A1 |
20180225982 | Jaeh et al. | Aug 2018 | A1 |
20180268341 | Rini et al. | Sep 2018 | A1 |
20180284453 | Irvin et al. | Oct 2018 | A1 |
20180341901 | Shike | Nov 2018 | A1 |
20190025905 | Godina et al. | Jan 2019 | A1 |
20190028492 | Coleman et al. | Jan 2019 | A1 |
20190051046 | Jin et al. | Feb 2019 | A1 |
20190051199 | Corbett | Feb 2019 | A1 |
20190087558 | Mercury et al. | Mar 2019 | A1 |
20190089701 | Mercury et al. | Mar 2019 | A1 |
20190090816 | Horseman | Mar 2019 | A1 |
20190114940 | Gobert et al. | Apr 2019 | A1 |
20190124471 | Chelnick | Apr 2019 | A1 |
20190207932 | Bud et al. | Jul 2019 | A1 |
20190276037 | Ito et al. | Sep 2019 | A1 |
20190295101 | Porter et al. | Sep 2019 | A1 |
20200118456 | Breed et al. | Apr 2020 | A1 |
20200126444 | Fu et al. | Apr 2020 | A1 |
20200160180 | Lehr et al. | May 2020 | A1 |
20200279464 | Llewelyn | Sep 2020 | A1 |
Number | Date | Country |
---|---|---|
2008030759 | Mar 2008 | WO |
Entry |
---|
Alicoding et al., "BadgeKit API", retrieved from https://github.com/mozilla/badgekit-api, Aug. 25, 2015, 3 pages. |
Brianloveswords, "Authorization", retrieved from https://github.com/mozilla/badgekit-api/blob/master/docs/authorization.md, Mar. 3, 2014, 2 pages. |
Chamilo/Chamilo-LMS, "Get badges when the user has achieved skills-ref BT#9082", Feb. 12, 2015, downloaded from https://github.com/chamilo/chamilo-lms/commit/15d7e22521aa752f9a96840ae57d493250533671 on Feb. 16, 2018, 2 pages. |
Grant, Sheryl L., "What Counts as Learning: Open Digital Badges for New Opportunities", Aug. 31, 2014, retrieved on Apr. 13, 2017, retrieved from the Internet <URL: https://dmlhub.net/wp-content/uploads/files/WhatCountsAsLearning_Grant.pdf>, entire document. |
Lester, Dave, "Assertion Specification Changes", Mozilla/openbadges-backpack, last edited Apr. 16, 2013, 1 revision, https://github.com/mozilla/openbadges-backpack/wiki/Assertion-Specification-Changes, retrieved Nov. 28, 2017, all pages. |
Mozilla, Open Badges Backpack-ng (Next Generation), Jun. 2012, https://github.com/mozilla/openbadges-backpack, retrieved Nov. 28, 2017, pp. 1-5. |
Mozilla, LRNG, IMS Global Learning Consortium, "Developers Guide", copyright 2016, http://openbadges.org/developers/, retrieved Nov. 28, 2017, pp. 1-12. |
Mozilla, LRNG, IMS Global Learning Consortium, "What's an Open Badge?", https://openbadges.org/get-started/, retrieved Nov. 28, 2017, all pages. |
Open Badges Specification, v1.1, May 1, 2015, retrieved on Apr. 12, 2017, retrieved from the Internet <URL: https://openbadgespec.org/history/1.1-specification.html>, entire document. |
Otto, Nate; Gylling, Markus, Editors, "Open Badges v2.0 IMS Candidate Final / Public Draft", IMS Global Learning Consortium, Mar. 8, 2017, http://www.imsglobal.org/Badges/OBv2p0/index.html, retrieved Nov. 28, 2017, pp. 1-17. |
Pearson, “Open badges are unlocking the emerging jobs economy”, 2013, pp. 1-7. |
Smith; Sue, “Assessment”, retrieved from https://github.com/mozilla/badgekit-api/blob/master/docs/assessment.md, Jun. 18, 2014, 19 pages. |
Smith; Sue, "Badgekit API Documentation", retrieved from https://github.com/mozilla/badgekit-api/blob/master/docs/README.md, Jun. 13, 2014, 2 pages. |
Smith; Sue, “Issuing”, retrieved from https://github.com/mozilla/badgekit-api/blob/master/docs/issuing.md, Jun. 24, 2014, 7 pages. |
Smith; Sue, et al., "API Endpoints", retrieved from https://github.com/mozilla/badgekit-api/blob/master/docs/api-endpoints.md, Jul. 31, 2014, 4 pages. |
Smith; Sue, et al., "Badges", retrieved from https://github.com/mozilla/badgekit-api/blob/master/docs/badges.md, Jul. 31, 2014, 20 pages. |
Smith; Sue, et al., “Claim Codes”, retrieved from https://github.com/mozilla/badgekit-api/blob/master/docs/claim-codes.md, Jun. 18, 2014, 11 pages. |
Smith; Sue, et al., “Issuers”, retrieved from https://github.com/mozilla/badgekit-api/blob/master/docs/issuers.md, Jun. 16, 2014, 9 pages. |
Smith; Sue, et al., “Milestones”, retrieved from https://github.com/mozilla/badgekit-api/blob/master/docs/milestones.md, Jun. 16, 2014, 15 pages. |
Smith; Sue, et al., "Programs", retrieved from https://github.com/mozilla/badgekit-api/blob/master/docs/programs.md, Jun. 16, 2014, 8 pages. |
Smith; Sue, et al., “SystemCallbacks (Webhooks)”, retrieved from https://github.com/mozilla/badgekit-api/blob/master/docs/webhooks.md, Jun. 13, 2014, 4 pages. |
Smith; Sue, et al., "Systems", retrieved from https://github.com/mozilla/badgekit-api/blob/master/docs/systems.md, Jun. 16, 2014, 9 pages. |
The Badge Alliance Standard Working Group, "Open Badges Technical Specification", published May 1, 2015, http://web.archive.org/web/20160426204303/https://openbadgespec.org/, retrieved Nov. 28, 2017, pp. 1-9. |
The Mozilla Foundation and Peer 2 Peer University in Collaboration With the MacArthur Foundation, “Open Badges for Lifelong Learning”, updated Aug. 27, 2012, all pages. |
The Mozilla Foundation and Peer 2 Peer University, in collaboration with The MacArthur Foundation, "Open Badges for Life Long Learning", Jan. 23, 2012, 14 pages. |
Thompson, Matt, "Introducing Open Badges 1.0", The Mozilla Blog, Mar. 14, 2013, https://blog.mozilla.org/blog/2013/03/14/open_badges/, retrieved Nov. 29, 2017, pp. 1-11. |
PCT/US2017/018817 received an International Search Report and Written Opinion dated May 5, 2017, all pages. |
PCT/US2017/018821 received an International Search Report and Written Opinion dated May 5, 2017, all pages. |
U.S. Appl. No. 15/081,173 received a Notice of Allowance dated Mar. 14, 2018, 5 pages. |
U.S. Appl. No. 15/081,215 received a First Action Interview Office Action dated Jan. 26, 2018, 5 pages. |
The Mozilla Foundation and Peer 2 Peer University, in collaboration with The MacArthur Foundation, “Open Badges for Lifelong Learning”, updated Aug. 27, 2012, pp. 1-14. Retrieved on Dec. 20, 2018, from URL: https://wiki.mozilla.org/images/5/59/OpenBadges-Working-Paper_012312.pdf. |
PCT/US2017/018817 received an International Search Report and Written Opinion dated May 3, 2017, 12 pages. |
PCT/US2018/049767 received an International Search Report and Written Opinion dated Nov. 28, 2018, 15 pages. |
Ross, "Drive Operator Excellence through Simulator Training", Honeywell Training Manual, Honeywell Users Group Europe, Middle East and Africa, 2013, 31 pages. |
Mayberry, Charles Randall, "Toward the Implementation of Augmented Reality Training", NOVA Southeastern University, 2013. |
Teaching and Testing in Flight Simulation Training Devices (FSTD), European Union Aviation Safety Agency (EASA), Dec. 18, 2015. |
Wuster, Mario, et al., "How to integrate and automatically issue Open Badges in MOOC Platforms", University of Technology, 2016. |
Number | Date | Country | |
---|---|---|---|
20190087829 A1 | Mar 2019 | US |
Number | Date | Country | |
---|---|---|---|
62559433 | Sep 2017 | US |