Various embodiments disclosed herein address technical challenges related to performing predictive data analysis operations, and address efficiency and reliability shortcomings of various existing predictive data analysis solutions, in accordance with at least some of the techniques described herein.
In general, embodiments disclosed herein provide methods, apparatuses, systems, computing devices, computing entities, and/or the like for performing predictive data analysis operations for predictive contribution determinations for various entities. For example, certain embodiments disclosed herein utilize systems, methods, and computer program products that perform predictive data analysis operations for an entity based on a per-candidate feature contribution score for the entity using a predictive analysis machine learning model.
An example embodiment may be used to explain adverse actions (AA) which may occur when a financial institution declines an application for credit for an entity, which in some situations may be a customer. Adverse actions may include i) refusals to grant credit in the amount or terms requested in the credit application, ii) a termination of an account or unfavorable change in terms of a corresponding account, iii) a refusal to increase the amount of credit available to an applicant, etc. The adverse action may be determined based on a corresponding entity score for the entity, which may be determined by a predictive analysis machine learning model. For example, an adverse action may be determined for an entity when the corresponding entity score fails to satisfy a determination decision threshold. The corresponding entity score may be large, which may indicate a high likelihood of default. The determination decision threshold may control the values or range of values which are acceptable (e.g., not associated with a high probability of default). In the event that an adverse action is determined for the entity, there may be a legal requirement to provide the entity with an explanation of why such an adverse action was determined. The predictive analysis machine learning model may further be configured to generate and/or provide a predictive contribution report, which may be indicative of one or more reasons the adverse action was determined. The predictive analysis machine learning model may use Baseline Shapley techniques (e.g., Shapley decomposition) to determine the reason(s) for the adverse action, such as by determining a per-candidate feature contribution score for each candidate feature.
The foregoing brief summary is provided merely for purposes of summarizing some example embodiments described herein. Because the above-described embodiments are merely examples, they should not be construed to narrow the scope of this disclosure in any way. It will be appreciated that the scope of the present disclosure encompasses many potential embodiments in addition to those summarized above, some of which will be described in further detail below.
Having described certain example embodiments in general terms above, reference will now be made to the accompanying drawings, which are not necessarily drawn to scale. Some embodiments may include fewer or more components than those shown in the figures.
Some example embodiments will now be described more fully hereinafter with reference to the accompanying figures, in which some, but not necessarily all, embodiments are shown. Because innovations described herein may be embodied in many different forms, the innovation should not be limited solely to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will satisfy applicable legal requirements.
The term “computing device” is used herein to refer to any one or all of programmable logic controllers (PLCs), programmable automation controllers (PACs), industrial computers, desktop computers, personal data assistants (PDAs), laptop computers, tablet computers, smart books, palm-top computers, personal computers, smartphones, wearable devices (such as headsets, smartwatches, or the like), and similar electronic devices equipped with at least a processor and any other physical components necessary to perform the various operations described herein. Devices such as smartphones, laptop computers, tablet computers, and wearable devices are generally collectively referred to as mobile devices.
Various embodiments disclosed herein relate to determining a predictive action to take for an entity based on associated per-candidate feature contribution scores generated for the entity using a predictive analysis machine learning model, thereby also providing interpretability of otherwise black-box outputs generated by the predictive analysis machine learning model. While the use of such machine learning techniques may allow for consideration of a wide range of entity features and associated increased predictive accuracy, such techniques often lack interpretability. For example, financial institutions may use machine learning techniques, either alone or in tandem with manual review, to determine whether to approve a customer's associated credit application. While use of such models aids in the accuracy of these decisions, compliance regulations may dictate that denial decisions be supplemented with reasons for decline, which may be complicated by a lack of insight into the dynamically weighted features of such models.
An example implementation may be used to explain adverse actions (AA) which occur when a financial institution declines an application for credit for an entity (e.g., a customer). Adverse actions may include i) refusals to grant credit in the amount or terms requested in the credit application, ii) a termination of an account or unfavorable change in terms of a corresponding account, iii) a refusal to increase the amount of credit available to an applicant, etc. The adverse action may be determined based on a corresponding entity score for the entity, which may be determined by a predictive analysis machine learning model. For example, an adverse action may be determined for an entity when the corresponding entity score fails to satisfy a determination decision threshold. The corresponding entity score may be large, which may indicate a high likelihood of default. The determination decision threshold may control the values or range of values which are acceptable (e.g., not associated with a high probability of default). In the event that an adverse action is determined for the entity, there may be a legal requirement to provide the entity with an explanation of why such an adverse action was determined. The predictive analysis machine learning model may further be configured to generate and/or provide a predictive contribution report. The predictive contribution report may indicate an explanation of the reasons that the adverse action was determined. The predictive analysis machine learning model may use Baseline Shapley techniques (e.g., Shapley decomposition) to determine the reason(s) for the adverse action, such as by determining a per-candidate feature contribution score for each candidate feature.
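As a minimal illustration of the Baseline Shapley technique described above, the sketch below attributes a model's entity score to individual candidate features by replacing features absent from a coalition with values from a reference (baseline) entity. The function name, the exhaustive enumeration of coalitions (practical only for small feature sets), and the plain-list data representation are illustrative assumptions, not the claimed implementation:

```python
from itertools import combinations
from math import factorial

def baseline_shapley(model, x, baseline):
    """Per-candidate feature contribution scores via Baseline Shapley.

    Features present in a coalition take their values from the entity `x`;
    absent features take values from the reference entity `baseline`.
    Exhaustive over all coalitions, so only suitable for small feature sets.
    """
    n = len(x)
    scores = [0.0] * n
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for size in range(n):
            # standard Shapley coalition weight |S|! (n - |S| - 1)! / n!
            weight = factorial(size) * factorial(n - size - 1) / factorial(n)
            for coalition in combinations(others, size):
                present = set(coalition)
                with_i = [x[j] if (j in present or j == i) else baseline[j]
                          for j in range(n)]
                without_i = [x[j] if j in present else baseline[j]
                             for j in range(n)]
                # marginal contribution of feature i to this coalition
                scores[i] += weight * (model(with_i) - model(without_i))
    return scores
```

For a linear model, the resulting per-candidate feature contribution scores reduce to each coefficient multiplied by the difference between the entity value and the baseline value for that feature.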
To address the above-noted technical challenges, various embodiments disclosed herein describe a predictive analysis machine learning model configured to generate an entity score, as well as a predictive contribution report based on each per-candidate feature contribution score for each candidate feature for an entity. Each candidate feature may be descriptive of a considered parameter used in the predictive analysis machine learning model and its impact on the entity score. Thus, the predictive analysis machine learning model may provide for an accurate entity score determination while also providing for interpretability of the impact of each candidate feature considered by said model.
Various embodiments disclosed herein also address technical challenges for efficient per-candidate feature contribution score determinations in real-time by introducing techniques that enable utilizing an existing reference entity selected based on one or more dynamically customizable reference determination decision thresholds, in part to determine the per-candidate feature contribution score for each candidate feature. By using an existing reference entity, which may satisfy the one or more reference determination decision thresholds, the predictive analysis machine learning model may reduce the computational complexity of runtime operations that may be associated with processing of a non-existent reference entity for use by the predictive analysis machine learning model. Additionally, the one or more dynamically customizable reference determination decision thresholds may be customized to the specifications of a user, institution, regulatory body, etc., thereby allowing for controllability with respect to determination of each per-candidate feature contribution score. For example, in some embodiments, it may be advantageous to select a reference entity such that a reference entity score (of the reference entity) is near a maximum reference entity score. Alternatively, in some embodiments, it may be advantageous to select a reference determination decision threshold such that a reference entity with a corresponding reference entity score near the value of the determination decision threshold is selected. In yet another embodiment, it may be advantageous to select a reference determination decision threshold near the value of the determination decision threshold, but with an additional buffer score (e.g., a reference entity score 10%-15% above the determination decision threshold value).
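The reference-entity selection strategies described above can be sketched as follows. The function and strategy names, the assumption that higher scores satisfy the reference determination decision threshold, and the 10% default buffer are illustrative choices rather than requirements of any embodiment:

```python
def select_reference_entity(scored_entities, threshold, strategy="max", buffer_frac=0.10):
    """Pick an existing reference entity from (entity_id, score) pairs.

    Only entities satisfying the reference determination decision threshold
    (assumed here: score >= threshold) are eligible. Strategies:
      "max"    -- score near the maximum reference entity score
      "near"   -- score nearest the decision threshold itself
      "buffer" -- score nearest the threshold plus a buffer (e.g., 10% above)
    """
    eligible = [(eid, s) for eid, s in scored_entities if s >= threshold]
    if not eligible:
        return None
    if strategy == "max":
        return max(eligible, key=lambda pair: pair[1])
    if strategy == "near":
        return min(eligible, key=lambda pair: abs(pair[1] - threshold))
    if strategy == "buffer":
        target = threshold * (1.0 + buffer_frac)
        return min(eligible, key=lambda pair: abs(pair[1] - target))
    raise ValueError(f"unknown strategy: {strategy}")
```

Because the reference entity is selected from existing scored entities, no synthetic reference needs to be constructed and scored at runtime, which is the efficiency gain noted above.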
Furthermore, in some embodiments, the predictive analysis machine learning model may determine a pairwise feature correlation score for each pair of candidate features, where, for example, each possible pair of candidate features from the set of candidate features is considered. In the event the pairwise feature correlation score for a pair of candidate features satisfies one or more feature correlation thresholds, the predictive analysis machine learning model may determine the per-candidate feature contribution score for the candidate features together. For example, if candidate features 1 and 5, candidate features 1 and 7, and candidate features 5 and 7 are determined to each have a pairwise feature correlation score which satisfies the one or more feature correlation thresholds, candidate features 1, 5, and 7 may be considered together when determining the per-candidate feature contribution score. As such, correlated candidate features may be considered in aggregate, thereby advantageously allowing for improved computational efficiency of computer-implemented modules that perform operations corresponding to the predictive analysis machine learning model. The predictive analysis machine learning model may therefore generate per-candidate feature contribution scores while reducing the computational complexity of runtime operations, thus resulting in a more time efficient and less computationally resource-intensive method to generate a predictive contribution report for the entity.
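The grouping of correlated candidate features described above can be sketched with a union-find (disjoint-set) structure: any pair whose pairwise feature correlation score satisfies the threshold is merged into the same group, so transitively correlated features (such as features 1, 5, and 7 in the example) end up together. The function and variable names are illustrative:

```python
def group_correlated_features(n_features, pairwise_scores, threshold):
    """Group candidate features whose pairwise feature correlation scores
    satisfy the threshold, using union-find with path compression.

    `pairwise_scores` maps (feature_a, feature_b) index pairs to scores.
    Returns a list of groups (lists of feature indices).
    """
    parent = list(range(n_features))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    for (a, b), score in pairwise_scores.items():
        if score >= threshold:
            parent[find(a)] = find(b)  # union the two groups

    groups = {}
    for i in range(n_features):
        groups.setdefault(find(i), []).append(i)
    return list(groups.values())
```

Each resulting group can then receive a single aggregate per-candidate feature contribution score, avoiding redundant per-feature computations over correlated features.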
Additionally, various embodiments disclosed herein make important technical contributions to improving resource-usage efficiency of post-prediction systems by using generated entity scores to set the number of allowed computing entities used by post-prediction systems, and thus perform operational load balancing for post-prediction systems. For example, a predictive data analysis computing entity may determine entity scores for N entities. Of the N entity scores, M entity scores may satisfy the determination decision threshold while L entity scores may fail to satisfy the determination decision threshold, where M plus L equals N. As the L entity scores which did not satisfy the determination decision threshold may require increased computational time from the associated predictive data analysis computing entity, it may be advantageous to distribute additional processing requests to predictive data analysis computing entities where the count of entity scores which did not satisfy the determination decision threshold is less than L. This may be done by dynamically allocating and de-allocating computing entities to the post-prediction processing operations based on the number of entity scores which satisfy the determination decision threshold.
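One minimal sketch of the load-balancing logic above: count the entity scores that fail the determination decision threshold (assumed here to be scores below it) and size the allowed pool of computing entities from that count. The function name and the one-extra-computing-entity-per-ten-failing-scores ratio are purely illustrative assumptions:

```python
def allocate_computing_entities(entity_scores, threshold, base_pool=1, per_failing=10):
    """Size the post-prediction compute pool from generated entity scores.

    Scores failing the determination decision threshold (assumed: score <
    threshold) take the costlier post-prediction path, so the failing count
    L drives the number of allowed computing entities.
    """
    failing = sum(1 for s in entity_scores if s < threshold)  # the L count
    passing = len(entity_scores) - failing                    # the M count
    # one extra computing entity per `per_failing` failing scores (ceil division)
    allowed = base_pool + -(-failing // per_failing)
    return {"passing": passing, "failing": failing, "allowed_entities": allowed}
```

Re-running this as new batches of entity scores arrive yields the dynamic allocation and de-allocation of computing entities described above.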
Thorough analyses on both simulated data and publicly available real data demonstrate both the efficiency and the interpretability improvements described above. In addition, as further disclosed herein, it is possible to increase interpretability of a generated entity score for an entity based on the entity score and a selected reference entity.
Although a high-level explanation of the operations of example embodiments has been provided above, specific details regarding the configuration of such example embodiments are provided below.
Embodiments disclosed herein may be implemented in various ways, including as computer program products that comprise articles of manufacture. Such computer program products may include one or more software components including, for example, software objects, methods, data structures, or the like. A software component may be coded in any of a variety of programming languages. An illustrative programming language may be a lower-level programming language such as an assembly language associated with a particular hardware framework and/or operating system platform. A software component comprising assembly language instructions may require conversion into executable machine code by an assembler prior to execution by the hardware framework and/or platform. Another example programming language may be a higher-level programming language that may be portable across multiple frameworks. A software component comprising higher-level programming language instructions may require conversion to an intermediate representation by an interpreter or a compiler prior to execution.
Other examples of programming languages include, but are not limited to, a macro language, a shell or command language, a job control language, a script language, a database query or search language, and/or a report writing language. In one or more example embodiments, a software component comprising instructions in one of the foregoing examples of programming languages may be executed directly by an operating system or other software component without having to be first transformed into another form. A software component may be stored as a file or other data storage construct. Software components of a similar type or functionally related may be stored together such as, for example, in a particular directory, folder, or library. Software components may be static (e.g., pre-established or fixed) or dynamic (e.g., created or modified at the time of execution).
A computer program product may include a non-transitory computer-readable storage medium storing applications, programs, program modules, scripts, source code, program code, object code, byte code, compiled code, interpreted code, machine code, executable instructions, and/or the like (also referred to herein as executable instructions, instructions for execution, computer program products, program code, and/or similar terms used herein interchangeably). Such non-transitory computer-readable storage media include all computer-readable media (including volatile and non-volatile media).
In one embodiment, a non-volatile computer-readable storage medium may include a floppy disk, flexible disk, hard disk, solid-state storage (SSS) (e.g., a solid state drive (SSD), solid state card (SSC), solid state module (SSM), enterprise flash drive), magnetic tape, or any other non-transitory magnetic medium, and/or the like. A non-volatile computer-readable storage medium may also include a punch card, paper tape, optical mark sheet (or any other physical medium with patterns of holes or other optically recognizable indicia), compact disc read only memory (CD-ROM), compact disc-rewritable (CD-RW), digital versatile disc (DVD), Blu-ray disc (BD), any other non-transitory optical medium, and/or the like. Such a non-volatile computer-readable storage medium may also include read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), flash memory (e.g., Serial, NAND, NOR, and/or the like), multimedia memory cards (MMC), secure digital (SD) memory cards, SmartMedia cards, CompactFlash (CF) cards, Memory Sticks, and/or the like. Further, a non-volatile computer-readable storage medium may also include conductive-bridging random access memory (CBRAM), phase-change random access memory (PRAM), ferroelectric random-access memory (FeRAM), non-volatile random-access memory (NVRAM), magnetoresistive random-access memory (MRAM), resistive random-access memory (RRAM), Silicon-Oxide-Nitride-Oxide-Silicon memory (SONOS), floating junction gate random access memory (FJG RAM), Millipede memory, racetrack memory, and/or the like.
In one embodiment, a volatile computer-readable storage medium may include random access memory (RAM), dynamic random access memory (DRAM), static random access memory (SRAM), fast page mode dynamic random access memory (FPM DRAM), extended data-out dynamic random access memory (EDO DRAM), synchronous dynamic random access memory (SDRAM), double data rate synchronous dynamic random access memory (DDR SDRAM), double data rate type two synchronous dynamic random access memory (DDR2 SDRAM), double data rate type three synchronous dynamic random access memory (DDR3 SDRAM), Rambus dynamic random access memory (RDRAM), Twin Transistor RAM (TTRAM), Thyristor RAM (T-RAM), Zero-capacitor (Z-RAM), Rambus in-line memory module (RIMM), dual in-line memory module (DIMM), single in-line memory module (SIMM), video random access memory (VRAM), cache memory (including various levels), flash memory, register memory, and/or the like. It will be appreciated that where embodiments are described to use a computer-readable storage medium, other types of computer-readable storage media may be substituted for or used in addition to the computer-readable storage media described above.
As should be appreciated, various embodiments disclosed herein may also be implemented as methods, apparatuses, systems, computing devices, computing entities, and/or the like. As such, embodiments may take the form of an apparatus, system, computing device, computing entity, and/or the like executing instructions stored on a computer-readable storage medium to perform certain steps or operations. Thus, embodiments disclosed herein may also take the form of an entirely hardware embodiment, an entirely computer program product embodiment, and/or an embodiment that comprises a combination of computer program products and hardware performing certain steps or operations.
Example embodiments are described below with reference to block diagrams and flowchart illustrations. Thus, it should be understood that each block of the block diagrams and flowchart illustrations may be implemented in the form of a computer program product, an entirely hardware embodiment, a combination of hardware and computer program products, and/or apparatuses, systems, computing devices, computing entities, and/or the like carrying out instructions, operations, steps, and similar words used interchangeably (e.g., the executable instructions, instructions for execution, program code, and/or the like) on a computer-readable storage medium for execution. For example, retrieval, loading, and execution of code may be performed sequentially such that one instruction is retrieved, loaded, and executed at a time. In some exemplary embodiments, retrieval, loading, and/or execution may be performed in parallel such that multiple instructions are retrieved, loaded, and/or executed together. Thus, such embodiments can produce specifically-configured machines performing the steps or operations specified in the block diagrams and flowchart illustrations. Accordingly, the block diagrams and flowchart illustrations support various combinations of embodiments for performing the specified instructions, operations, or steps.
The system architecture 100 includes a storage subsystem 120 configured to store at least a portion of the data utilized by the predictive data analysis system 110. The predictive data analysis computing entity 115 may be in communication with one or more external computing entities 105. The predictive data analysis computing entity 115 may be configured to train a prediction model based at least in part on the training data 155 stored in the storage subsystem 120, store trained prediction models as part of the model definition data store 150 stored in the storage subsystem 120, utilize trained models to generate predictions based at least in part on prediction inputs provided by an external computing entity 105, and perform prediction-based actions based at least in part on the generated predictions. The storage subsystem 120 may be configured to store the model definition data store 150 for one or more predictive analysis models and the training data 155 used to train one or more predictive analysis models. The predictive data analysis computing entity 115 may be configured to receive requests and/or data from external computing entities 105, process the requests and/or data to generate predictive outputs, and provide the predictive outputs to the external computing entities 105. The external computing entity 105 may periodically update/provide raw input data (e.g., data objects describing an entity input data object) to the predictive data analysis system 110.
The storage subsystem 120 may be configured to store at least a portion of the data utilized by the predictive data analysis computing entity 115 to perform predictive data analysis steps/operations and tasks. The storage subsystem 120 may be configured to store at least a portion of operational data and/or operational configuration data including operational instructions and parameters utilized by the predictive data analysis computing entity 115 to perform predictive data analysis steps/operations in response to requests. The storage subsystem 120 may include one or more storage units, such as multiple distributed storage units that are connected through a computer network. Each storage unit in the storage subsystem 120 may store at least one of one or more data assets and/or one or more data about the computed properties of one or more data assets. Moreover, each storage unit in the storage subsystem 120 may include one or more non-volatile storage or memory media including but not limited to hard disks, ROM, PROM, EPROM, EEPROM, flash memory, MMCs, SD memory cards, Memory Sticks, CBRAM, PRAM, FeRAM, NVRAM, MRAM, RRAM, SONOS, FJG RAM, Millipede memory, racetrack memory, and/or the like.
The predictive data analysis computing entity 115 includes a predictive data analysis engine 130 and a contribution determination engine 135. The predictive data analysis engine 130 may be configured to perform predictive data analysis based at least in part on an entity input data object. For example, the predictive data analysis engine 130 may be configured to generate one or more entity scores corresponding to one or more entities. The contribution determination engine 135 may be configured to determine each per-candidate feature contribution score in accordance with the model definition data store 150 stored in the storage subsystem 120.
As indicated, in one embodiment, the predictive data analysis computing entity 115 may also include communications hardware 220 for communicating with various computing entities, such as by communicating data, content, information, and/or similar terms used herein interchangeably that can be transmitted, received, operated on, processed, displayed, stored, and/or the like.
The communications hardware 220 may further be configured to provide output to a user and, in some embodiments, to receive an indication of user input. In this regard, the communications hardware 220 may comprise a user interface, such as a display, and may further comprise the components that govern use of the user interface, such as a web browser, mobile application, dedicated client device, or the like. In some embodiments, the communications hardware 220 may include a keyboard, a mouse, a touch screen, touch areas, soft keys, a microphone, a speaker, and/or other input/output mechanisms. The communications hardware 220 may utilize the processing element 205 to control one or more functions of one or more of these user interface elements through software instructions (e.g., application software and/or system software, such as firmware) stored on a memory (e.g., non-volatile memory 210) accessible to the processing element 205.
As shown in
For example, the processing element 205 may be embodied as one or more complex programmable logic devices (CPLDs), microprocessors, multi-core processors, coprocessing entities, application-specific instruction-set processors (ASIPs), microcontrollers, and/or controllers. Further, the processing element 205 may be embodied as one or more other processing devices or circuitry. The term circuitry may refer to an entirely hardware embodiment or a combination of hardware and computer program products. Thus, the processing element 205 may be embodied as integrated circuits, application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), programmable logic arrays (PLAs), hardware accelerators, other circuitry, and/or the like.
As will therefore be understood, the processing element 205 may be configured for a particular use or configured to execute instructions stored in volatile or non-volatile media or otherwise accessible to the processing element 205. As such, whether configured by hardware or computer program products, or by a combination thereof, the processing element 205 may be capable of performing steps or operations according to embodiments disclosed herein when configured accordingly.
In one embodiment, the predictive data analysis computing entity 115 may further include or be in communication with non-volatile media (also referred to as non-volatile storage, memory, memory storage, memory circuitry and/or similar terms used herein interchangeably). In one embodiment, the non-volatile storage or memory may include at least one non-volatile memory 210, including but not limited to hard disks, ROM, PROM, EPROM, EEPROM, flash memory, MMCs, SD memory cards, Memory Sticks, CBRAM, PRAM, FeRAM, NVRAM, MRAM, RRAM, SONOS, FJG RAM, Millipede memory, racetrack memory, and/or the like.
As will be recognized, the non-volatile storage or memory media may store databases, database instances, database management systems, data, applications, programs, program modules, scripts, source code, object code, byte code, compiled code, interpreted code, machine code, executable instructions, and/or the like. The term database, database instance, database management system, and/or similar terms used herein interchangeably may refer to a collection of records or data that is stored in a computer-readable storage medium using one or more database models, such as a hierarchical database model, network model, relational model, entity-relationship model, object model, document model, semantic model, graph model, and/or the like.
In one embodiment, the predictive data analysis computing entity 115 may further include or be in communication with volatile media (also referred to as volatile storage, memory, memory storage, memory circuitry and/or similar terms used herein interchangeably). In one embodiment, the volatile storage or memory may also include at least one volatile memory 215, including but not limited to RAM, DRAM, SRAM, FPM DRAM, EDO DRAM, SDRAM, DDR SDRAM, DDR2 SDRAM, DDR3 SDRAM, RDRAM, TTRAM, T-RAM, Z-RAM, RIMM, DIMM, SIMM, VRAM, cache memory, register memory, and/or the like.
As will be recognized, the volatile storage or memory media may be used to store at least portions of the databases, database instances, database management systems, data, applications, programs, program modules, scripts, source code, object code, byte code, compiled code, interpreted code, machine code, executable instructions, and/or the like being executed by, for example, the processing element 205. Thus, the databases, database instances, database management systems, data, applications, programs, program modules, scripts, source code, object code, byte code, compiled code, interpreted code, machine code, executable instructions, and/or the like may be used to control certain aspects of the operation of the predictive data analysis computing entity 115 with the assistance of the processing element 205 and operating system.
The communications hardware 220 may support communication using a wired data transmission protocol, such as fiber distributed data interface (FDDI), digital subscriber line (DSL), Ethernet, asynchronous transfer mode (ATM), frame relay, data over cable service interface specification (DOCSIS), or any other wired transmission protocol. Similarly, the predictive data analysis computing entity 115 may be configured to communicate via wireless client communication networks using any of a variety of protocols, such as general packet radio service (GPRS), Universal Mobile Telecommunications System (UMTS), Code Division Multiple Access 2000 (CDMA2000), CDMA2000 1× (1×RTT), Wideband Code Division Multiple Access (WCDMA), Global System for Mobile Communications (GSM), Enhanced Data rates for GSM Evolution (EDGE), Time Division-Synchronous Code Division Multiple Access (TD-SCDMA), Long Term Evolution (LTE), Evolved Universal Terrestrial Radio Access Network (E-UTRAN), Evolution-Data Optimized (EVDO), High Speed Packet Access (HSPA), High-Speed Downlink Packet Access (HSDPA), IEEE 802.11 (Wi-Fi), Wi-Fi Direct, 802.16 (WiMAX), ultra-wideband (UWB), infrared (IR) protocols, near field communication (NFC) protocols, Wibree, Bluetooth protocols, wireless universal serial bus (USB) protocols, and/or any other wireless protocol.
Although not shown, the predictive data analysis computing entity 115 may include or be in communication with one or more input elements, such as a keyboard input, a mouse input, a touch screen/display input, motion input, movement input, audio input, pointing device input, joystick input, keypad input, and/or the like. The predictive data analysis computing entity 115 may also include or be in communication with one or more output elements (not shown), such as audio output, video output, screen/display output, motion output, movement output, and/or the like.
Via these communication standards and protocols, the external computing entity 105 can communicate with various other entities using concepts such as Unstructured Supplementary Service Data (USSD), Short Message Service (SMS), Multimedia Messaging Service (MMS), Dual-Tone Multi-Frequency Signaling (DTMF), and/or Subscriber Identity Module Dialer (SIM dialer). The external computing entity 105 can also download changes, add-ons, and updates, for instance, to its firmware, software (e.g., including executable instructions, applications, program modules), and operating system.
The external computing entity 105 may also comprise a user interface (that can include a display coupled to a processing element) and/or a user input interface (coupled to a processing element 308). For example, the user interface may be a user application, browser, user interface, and/or similar words used herein interchangeably executing on and/or accessible via the external computing entity 105 to interact with and/or cause display of information/data from the predictive data analysis computing entity 115, as described herein. The user input interface can comprise any of a number of devices or interfaces allowing the external computing entity 105 to receive data, such as a keypad (hard or soft), a touch display, voice/speech or motion interfaces, or other input device.
The external computing entity 105 can also include volatile storage or memory 322 and/or non-volatile storage or memory 324, which can be embedded and/or may be removable. The volatile and non-volatile storage or memory can store databases, database instances, database management systems, data, applications, programs, program modules, scripts, source code, object code, byte code, compiled code, interpreted code, machine code, executable instructions, and/or the like to implement the functions of the external computing entity 105. As indicated, this may include a user application that is resident on the entity or accessible through a browser or other user interface for communicating with the predictive data analysis computing entity 115 and/or various other computing entities.
In another embodiment, the external computing entity 105 may include one or more components or functionality that are the same or similar to those of the predictive data analysis computing entity 115, as described in greater detail above. As will be recognized, these frameworks and descriptions are provided for exemplary purposes only and are not limiting to the various embodiments.
In various embodiments, the external computing entity 105 may be embodied as an artificial intelligence (AI) computing entity, such as an Amazon Echo, Amazon Echo Dot, Amazon Show, Google Home, and/or the like. Accordingly, the external computing entity 105 may be configured to provide and/or receive information/data from a user via an input/output mechanism, such as a display, a video capture device (e.g., camera), a speaker, a voice-activated input, and/or the like. In certain embodiments, an AI computing entity may comprise one or more predefined and executable program algorithms stored within an onboard memory storage module, and/or accessible over a network. In various embodiments, the AI computing entity may be configured to retrieve and/or execute one or more of the predefined program algorithms upon the occurrence of a predefined trigger event.
Turning to
As shown by operation 402, predictive data analysis computing entity 115 includes means, such as processing element 205, communications hardware 220, or the like, for generating an entity score for an entity. In some embodiments, the predictive data analysis computing entity 115 may receive an entity input data object for a respective entity. The entity input data object may also include a requested action. For example, the entity input data object may describe various financial feature values for a particular user (e.g., entity) applying for a credit product, and the requested action may be a prediction of whether the user will default on the credit product. The predictive data analysis computing entity 115 may be configured to process the entity input data object using a predictive analysis machine learning model to generate an entity score for the respective entity. The entity score may be based at least in part on a set of entity feature sub-scores each associated with a respective candidate feature of a plurality of candidate features (e.g., a set of candidate features). The plurality of candidate features may be features used by the predictive analysis machine learning model.
In some embodiments, the predictive analysis machine learning model may refer to an electronically-stored data construct that is configured to describe parameters, hyperparameters, and/or stored operations of a machine learning model that is configured to process an entity input data object and generate an entity score for the entity. An entity input data object may be configured to describe data values for the entity. For example, in some embodiments, an entity input data object may correspond to a credit application for an individual (e.g., an entity). As such, the entity input data object may include pertinent information for the individual such as an address, phone number, social security number, employer identification number, credit references, credit scores, income amount, employment history, debt amount, debt-to-income ratio, etc.
In some embodiments, the predictive analysis machine learning model may be a trained neural network model. In particular, in some embodiments, the predictive analysis machine learning model may be a feedforward neural network (FFNN) model or a monotone neural network (mono-NN) model. The predictive analysis machine learning model may be configured to process the data included within the entity input data object to determine an entity score for the entity. The generated entity score may be output as a vector comprising a numerical value (e.g., binary, decimal, etc.), categorical value (e.g., “approve”, “deny”, etc.), Boolean value (e.g., true, false), and/or the like. In some embodiments, the entity score may be determined based on a reference determination decision threshold, as will be described in further detail below.
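For illustration purposes only, the scoring step described above may be sketched as a minimal feedforward network that maps an entity feature vector to an entity score in (0, 1). The class name TinyFFNN and the fixed weights below are hypothetical and do not reflect any particular disclosed embodiment:

```python
import numpy as np

def sigmoid(z):
    """Squash a raw network output into (0, 1), interpretable as p(x)."""
    return 1.0 / (1.0 + np.exp(-z))

class TinyFFNN:
    """Minimal feedforward scoring sketch: one hidden layer, sigmoid output
    interpreted as the entity score (e.g., predicted probability of default)."""
    def __init__(self, w1, b1, w2, b2):
        self.w1, self.b1, self.w2, self.b2 = w1, b1, w2, b2

    def entity_score(self, x):
        h = np.tanh(x @ self.w1 + self.b1)            # hidden layer
        return float(sigmoid(h @ self.w2 + self.b2))  # entity score in (0, 1)
```

A monotone neural network variant would additionally constrain the sign of selected weights so the score is monotone in designated features; that constraint is omitted here for brevity.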
At operation 404, the predictive data analysis computing entity 115 includes means, such as processing element 205, or the like, for determining whether the entity score satisfies a determination decision threshold. The predictive data analysis computing entity 115 may be configured to compare the entity score to the determination decision threshold. A determination decision threshold may be dynamically determined based at least in part on an analysis of aggregated historical entity scores, each associated with a respective entity of a plurality of entities, which may be stored in the storage subsystem 120. The determination decision threshold may be updated periodically, semi-periodically, or when manually requested. As such, the determination decision threshold may be updated in view of recent historical entity data, thereby allowing for increased accuracy in determining the determination decision threshold to be used with respect to an entity score.
For example, denote p(x) as the entity score (e.g., predicted probability of default) of an entity x (e.g., representing a customer characteristic) with candidate features (x1, . . . , xK), which may be obtained by fitting a model to historical data. Then denote τ as the suitably chosen determination decision threshold. A credit-decision algorithm may approve a future loan application with attribute x* if p(x*)≤τ and decline otherwise. In practice, p(x) may be developed in terms of a link function, such as ƒ(x)=logit[p(x)]. Since the model may be simpler or more interpretable in terms of ƒ(x), example embodiments may be disclosed herein in terms of ƒ(x). However, the same embodiments may be implemented in terms of p(x).
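The threshold comparison and the logit link above may be sketched as follows. This is an illustrative sketch only; the helper names decide, logit, and inv_logit, and the example threshold value, are hypothetical:

```python
import math

def logit(p):
    """Link function f(x) = logit[p(x)] = log(p / (1 - p))."""
    return math.log(p / (1.0 - p))

def inv_logit(f):
    """Recover p(x) from the link scale."""
    return 1.0 / (1.0 + math.exp(-f))

def decide(p_x, tau=0.25):
    """Approve when the predicted default probability p(x*) <= tau,
    decline otherwise (tau = 0.25 is an illustrative threshold)."""
    return "approve" if p_x <= tau else "decline"
```

Because logit is strictly monotone, comparing ƒ(x*) against logit(τ) yields the same decision as comparing p(x*) against τ, which is why the embodiments may be stated in terms of either scale.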
In an instance in which the entity score satisfies the determination decision threshold, the process proceeds to operation 406. At operation 406, the predictive data analysis computing entity 115 includes means, such as processing element 205, or the like, for generating a confirmation data object. The confirmation data object may indicate the entity score associated with the entity satisfies the determination decision threshold. In some embodiments, the confirmation data object may also indicate the entity score associated with the entity. The confirmation data object may be provided to one or more end users via an associated user device. By way of continuing example, an individual who has applied for a credit application, and who is approved for said application (e.g., the associated entity score satisfies the determination decision threshold), may be provided with a confirmation data object indicating that the credit application has been approved. Additionally, the confirmation data object may include supplemental information, such as details regarding next steps, account details, login information, and/or the like.
In an instance in which the entity score does not satisfy the determination decision threshold, the process proceeds to operation 408. In such a situation, by way of continuing example, an individual who has applied for a credit application may be denied for said application (e.g., the associated entity score fails to satisfy the determination threshold). As such, it may be necessary to provide the user (e.g., the individual who applied for a credit application) with an indication of why he/she was denied and optionally, provide other users (e.g., employees associated with the system) with this indication as well. As described above, due to the complexities associated with machine learning models, it may be difficult to ascertain the reason for such a denial. As such, the operations described in operations 408-418 may improve the interpretability of the predictive analysis machine learning model that was used to generate the entity score that, ultimately, led to the outcome of the denial of the credit application. As such, the subsequent operations described below may allow the individual, credit loan officers, regulatory body employees, and the like to understand the reason for the denial.
At operation 408, the predictive data analysis computing entity 115 includes means, such as processing element 205, or the like, for selecting a reference entity. The predictive data analysis computing entity 115 may select the reference entity from a plurality of candidate reference entities, such as those stored in the storage subsystem 120. The reference entity may be selected based at least in part on a set of reference feature sub-scores each associated with a respective candidate feature of a plurality of candidate features. The plurality of candidate features associated with the reference entity may be the same candidate features associated with the entity.
The predictive data analysis computing entity 115 may use the predictive analysis machine learning model to select a reference entity from a plurality of candidate reference entities based on whether the one or more candidate reference entity scores satisfy one or more reference determination decision thresholds. The one or more dynamically customizable reference determination decision thresholds may be dynamically determined based on one or more rules specified by a user, institution, regulatory body, or the like. For example, in some embodiments, it may be advantageous to select a reference determination decision threshold such that a reference entity associated with a reference entity score is near a maximum reference entity score (e.g., a greatest reference entity score). Alternatively, in some embodiments, it may be advantageous to select a reference determination decision threshold such that a reference entity score near the value of the determination decision threshold is selected. In yet another embodiment, it may be advantageous to select a reference determination decision threshold near the value of the determination decision threshold but with an additional buffer score (e.g., a reference entity score 10%-15% above the determination decision threshold value).
For example, the predictive data analysis computing entity 115 may compare the entity (e.g., loan characteristic) xD with reference entity xA. In various examples, the reference entity xA corresponds to a loan application that has an entity score satisfying the determination decision threshold (e.g., the loan would be approved).
For the following examples, suppose xD represents an entity that does not satisfy a determination decision threshold (which may happen for a declined loan application). Several example methods are disclosed below for selecting a reference feature sub-score that lies within the accept region 604. The chosen method for making this selection may depend on a number of practical considerations. For a first example, reference entity xA may be chosen at high values of the feature sub-scores in the accept region 604, such as values close to a maximum. For a second example, xA may be chosen as the point with the shortest distance from xD to the determination decision threshold boundary (“boundary”). This choice may provide information on the smallest changes needed to move the entity across the boundary to satisfy the determination decision threshold. However, in this example, the determination decision threshold boundary may be estimated from data and may be subject to inherent variability, and the reference feature may vary with the individual entity, making small variations difficult to explain across multiple entities. For a third example, reference features may be considered that have the shortest distance in a lower-dimensional subspace of the candidate features (e.g., features x1, . . . xK). The contribution determination engine 135 may select the lower-dimensional subspace based on candidate features that are pre-determined as relevant. The lower-dimensional subspace may have fewer dimensions than the original feature space in which the entity is embedded (e.g., feature space 600). For example, a customer may be interested in a subset of candidate features (xa1, . . . xaN) deemed relevant because the customer may have more direct input to modify the candidate features belonging to the subset.
By determining the candidate attribute using the lower-dimensional space defined by the subset of features, a determination may be made that relies only on the features of interest to the customer.
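The reference-selection strategies above may be sketched as a single helper that supports a best-score choice, a nearest-point choice, and a nearest-point choice restricted to a lower-dimensional subspace of features. The function name select_reference and its arguments are hypothetical:

```python
import numpy as np

def select_reference(x_d, candidates, scores, tau, method="nearest", subspace=None):
    """Select a reference entity x_A from candidate points in the accept region.

    candidates: (N, K) array of candidate reference entities
    scores:     (N,)  entity scores p(x) for each candidate
    tau:        determination decision threshold (accept when score <= tau)
    subspace:   optional list of feature indices for the lower-dimensional distance
    """
    accept = scores <= tau                 # keep only candidates in the accept region
    pts, s = candidates[accept], scores[accept]
    if method == "best_score":             # strategy 1: strongest accepted point
        return pts[np.argmin(s)]
    dims = subspace if subspace is not None else slice(None)
    # strategies 2 and 3: nearest accepted point, optionally in a subspace
    d = np.linalg.norm(pts[:, dims] - x_d[dims], axis=1)
    return pts[np.argmin(d)]
```

Passing a subspace of the indices the customer can directly modify implements the third example, in which the distance is measured only over the features of interest.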
At operation 410, the predictive data analysis computing entity 115 may include means, such as processing element 205, or the like, for determining a pairwise feature correlation score for each pair of candidate features. In some embodiments, the predictive data analysis computing entity 115 may receive candidate features that are highly or moderately correlated, which may occur especially for models including a large number of candidate features. The predictive data analysis computing entity 115 may detect that model values of extrapolation entity scores (described in further detail below) may lie outside the envelope where the model makes reliable predictions. In such instances, the correlated candidate features may be treated jointly by deriving new candidate features, where a new candidate feature may be a function of two of the highly correlated candidate features.
At operation 412, the predictive data analysis computing entity 115 includes means, such as processing element 205, or the like, for determining a per-candidate feature contribution score for each candidate feature. In some embodiments, the predictive analysis machine learning model may be configured to process an entity input data object and generate a per-candidate feature contribution score for each candidate feature. The predictive analysis machine learning model may decompose the model (e.g., weighted linear function) used to generate the entity score to generate each per-candidate feature contribution score for each candidate feature. The generated per-candidate feature contribution scores may be output as a vector comprising each per-candidate feature contribution score. Each position in the vector may correspond to a respective candidate feature.
The predictive data analysis computing entity 115 may use the predictive analysis machine learning model to determine each per-candidate contribution score for each candidate feature. The predictive analysis machine learning model may be configured to use Baseline Shapley techniques (e.g., Shapley decomposition) to generate the per-candidate contribution score for each candidate feature. In some embodiments, the predictive analysis machine learning model may be configured to decompose the difference between the model output (e.g., weighted linear combination) for the entity and the model output for the reference entity to generate each per-candidate contribution score.
Turning to
ƒ(x)=b0+b1x1+ . . . +bKxK.
The model may be decomposed into per-candidate feature contribution scores for a given reference entity xA according to:
where Ek(xD, xA) is the per-candidate feature contribution score of the kth candidate feature to the model. The decomposition of the example linear model is straightforward, with Ek=bk(xkD−xkA) for k=1, . . . , K. For simplification purposes, Ek(xD, xA) may be denoted simply as Ek.
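The linear decomposition above, Ek=bk(xkD−xkA), may be sketched directly. The function name linear_contributions is hypothetical:

```python
def linear_contributions(b, x_d, x_a):
    """Per-candidate feature contribution scores E_k = b_k * (x_k^D - x_k^A)
    for a linear model f(x) = b0 + b1*x1 + ... + bK*xK.

    b:          coefficients (b0, b1, ..., bK), intercept first
    x_d, x_a:   declined (entity) and reference points, length K each
    """
    return [bk * (xd - xa) for bk, xd, xa in zip(b[1:], x_d, x_a)]
    # Note: the contributions sum exactly to f(x_d) - f(x_a); the intercept b0 cancels.
```

Because the model is additive, no interaction terms need to be allocated, and the K contributions reproduce the score difference exactly.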
As another example, a two-factor linear model with even simple interactions may be represented in the form:
ƒ(x)=b0+b1x1+b2x2+b12x1x2.
The model may then be decomposed into per-candidate feature contribution score according to:
ƒ(xD)−ƒ(xA)=b1(x1D−x1A)+b2(x2D−x2A)+b12(x1Dx2D−x1Ax2A).
The last term on the right-hand side involves both x1 and x2 and thus must be further decomposed and allocated to each separately.
To further appreciate the above two-factor linear model, a general two-factor model with two candidate features is detailed. Consider a function:
ƒ(x)=ƒ(x1,x2).
The model may be decomposed into per-candidate feature contribution scores for a given reference entity xA according to:
Here, E11 may be defined as [ƒ(x1D,x2D)−ƒ(x1A,x2D)], E12 may be defined as [ƒ(x1D,x2A)−ƒ(x1A,x2A)], E21 may be defined as [ƒ(x1D,x2D)−ƒ(x1D,x2A)], and E22 may be defined as [ƒ(x1A,x2D)−ƒ(x1A,x2A)].
Further, E11 may measure the difference when x1 changes from its level at the declined point 706 to its level at the point 712, with x2 fixed at its declined level. Similarly, E12 may measure the corresponding difference with x2 fixed at its reference level. Note that computing the per-candidate feature contribution scores only requires computing function values at the four corners depicted in
As such, the per-candidate feature contribution scores for the x1 and x2 candidate features may each be taken as the average of the two corresponding values:
Returning now to the two-factor linear model with simple interactions (e.g., ƒ(x)=b0+b1x1+b2x2+b12x1x2), by defining E1 and E2 as shown above, the interaction term may be expressed as:
This may be further simplified to yield
Thus, the two-factor linear model may have the per-candidate feature contribution scores as:
The above results may be generalized to consider K candidate features which allows a model with K candidate features to be expressed as:
Thus, a per-candidate feature contribution score of the kth candidate feature Ek may be decomposed as follows:
To compute the decompositions, a total of 4·(K choose 2) function evaluations are required. In particular, (K choose 2) two-factor sub-models are evaluated, and each requires four function evaluations.
At operation 504, the predictive data analysis computing entity 115 includes means, such as processing element 205, or the like, for evaluating, using a Baseline-Shapley decomposition function, the per-candidate feature contribution score based on the set of extrapolation feature sub-scores, entity feature sub-scores, and reference feature sub-scores. The extrapolation scores may be identified with the extrapolation point 708 and extrapolation point 710 in feature space 700 from
In some embodiments, the predictive analysis machine learning model may be configured to determine a pairwise feature correlation score for each pair of candidate features. A pair of candidate features may include two or more candidate features that are selected from the set of candidate features. In an instance the pairwise feature correlation score for a pair of candidate features satisfies one or more feature correlation thresholds, the predictive analysis machine learning model may determine the per-candidate feature contribution score for the candidate features together. For example, if candidate features 1 and 5, candidate features 1 and 7, and candidate features 5 and 7 are determined to each have a pairwise feature correlation score which satisfies the one or more feature correlation thresholds, candidate features 1, 5, and 7 may be considered together when determining the per-candidate feature contribution score.
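The grouping of candidate features whose pairwise correlation satisfies a feature correlation threshold may be sketched as a simple union-find merge over the correlation matrix. The function name correlated_groups and the threshold value are hypothetical:

```python
import numpy as np

def correlated_groups(X, threshold=0.8):
    """Group candidate features whose pairwise |correlation| satisfies the
    feature correlation threshold, so grouped features may be treated jointly.

    X: (N, K) matrix of observed feature values
    Returns a list of feature-index groups (singletons for uncorrelated features)."""
    corr = np.corrcoef(X, rowvar=False)   # K x K pairwise correlation matrix
    k = corr.shape[0]
    parent = list(range(k))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    for i in range(k):
        for j in range(i + 1, k):
            if abs(corr[i, j]) >= threshold:
                parent[find(i)] = find(j)  # merge the correlated pair
    groups = {}
    for i in range(k):
        groups.setdefault(find(i), []).append(i)
    return sorted(groups.values())
```

The union-find merge makes the grouping transitive, so, as in the example above, features 1, 5, and 7 would land in one group whenever each pair satisfies the threshold.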
The previous example of
for k=1, . . . , K. The summation is performed over all possible subsets, S, and the combinatorial coefficients arise from the number of such subsets.
The Shapley decomposition may be applied to the problem of decomposing the fitted model prediction ƒ(x*) into contributions of the K variables by specifying a particular value function. Applying B-Shap in particular, with Sk=S\{k}, the subset of S without {k}, the per-candidate feature contribution score for the kth candidate feature may be written as:
The B-Shap decomposition involves only function evaluations (e.g., no integration is needed), so it is computationally more efficient than alternative methods. A model with K candidate features may involve at most 2^K function evaluations.
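The general B-Shap computation may be sketched by enumerating subsets and evaluating the model only at points whose coordinates are drawn from either x* or the reference point, caching evaluations so that at most 2^K are performed. The function name baseline_shapley is hypothetical:

```python
from itertools import combinations
from math import factorial

def baseline_shapley(f, x_star, x_base):
    """Baseline Shapley (B-Shap) decomposition of f(x_star) - f(x_base).

    f:      model taking a tuple of K feature values
    Returns the list of K per-feature contributions (Shapley values)."""
    k = len(x_star)
    cache = {}

    def eval_on(subset):
        # Features in `subset` take their x* values; the rest stay at baseline.
        key = frozenset(subset)
        if key not in cache:
            point = tuple(x_star[i] if i in key else x_base[i] for i in range(k))
            cache[key] = f(point)
        return cache[key]

    phi = [0.0] * k
    indices = list(range(k))
    for j in range(k):
        rest = indices[:j] + indices[j + 1:]
        for size in range(k):
            # Shapley combinatorial weight |S|! (K - |S| - 1)! / K!
            weight = factorial(size) * factorial(k - size - 1) / factorial(k)
            for s in combinations(rest, size):
                phi[j] += weight * (eval_on(s + (j,)) - eval_on(s))
    return phi
```

For a purely linear model the contributions reduce to bk(xk*−xkA), matching the earlier linear decomposition, and the contributions always sum to ƒ(x*)−ƒ(xA).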
Returning to
At operation 416, the predictive data analysis computing entity 115 includes means, such as processing element 205, or the like, for generating a predictive contribution report. In some embodiments, the predictive contribution report is configured to describe each candidate feature which does not satisfy one or more contribution thresholds. In some embodiments, a contribution threshold may be an absolute numerical value to which each per-candidate feature contribution score may be compared. For example, a contribution threshold of 0.600 may indicate that candidate features associated with per-candidate feature contribution scores above 0.600 are to be included. In some embodiments, a contribution threshold may be a percentage value to which each per-candidate feature contribution score may be compared. For example, a contribution threshold of 15% may indicate that candidate features whose per-candidate feature contribution scores account for at least 15% of the sum of the aggregated per-candidate feature contribution scores are to be included. In some embodiments, a contribution threshold may be a value indicative of a count of candidate features to select. For example, a contribution threshold of 5 may indicate that the top 5 candidate features with the relatively largest per-candidate feature contribution scores, as compared to those of the remaining candidate features, are to be included.
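The three contribution-threshold styles described above (absolute cutoff, percentage of the total, and top-k count) may be sketched as a single selection helper. The function name select_reported_features and the mode labels are hypothetical:

```python
def select_reported_features(contributions, mode="top_k", value=5):
    """Pick the candidate features to describe in the predictive contribution
    report, under the three example contribution-threshold styles.

    contributions: dict mapping feature name -> per-candidate contribution score"""
    if mode == "absolute":      # e.g., value=0.600: keep scores above the cutoff
        return [f for f, c in contributions.items() if c > value]
    if mode == "percent":       # e.g., value=0.15: keep scores >= 15% of the total
        total = sum(contributions.values())
        return [f for f, c in contributions.items() if c >= value * total]
    if mode == "top_k":         # e.g., value=5: keep the largest-scoring features
        ranked = sorted(contributions, key=contributions.get, reverse=True)
        return ranked[:value]
    raise ValueError(f"unknown mode: {mode}")
```

In practice an implementation might combine styles, e.g., report the top-k features that also exceed an absolute cutoff.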
In some embodiments, the predictive contribution report may also include one or more recommendations for the entity that may improve the overall entity score based on each per-feature contribution score. For example, if a candidate feature associated with a ‘percent card utilization’ feature is determined to correspond to a per-candidate feature contribution score which does not satisfy the one or more contribution thresholds, then a recommendation may describe a predictive action of “decrease utilization of current credit usage”. As such, the entity may be provided with recommendations of predictive actions which may improve the associated entity score.
Optionally, at operation 416, the predictive data analysis computing entity 115 includes means, such as processing element 205, communications hardware 220, or the like, for generating a preliminary risk category for the entity described by the entity input data object. In particular, the predictive data analysis computing entity 115 may be configured to generate a preliminary risk category for the entity based on the overall model response. A preliminary risk category may be indicative of an inferred risk associated with performing the requested action for the entity. A preliminary risk category may include a high-risk preliminary category, a medium-risk preliminary category, and a low-risk preliminary category, for example. By way of continuing example, the overall model response for the portfolio may be an increase in predicted value of the stock and therefore, a preliminary risk category for the portfolio may be determined to be a low preliminary risk category. As another example, an overall model response for the portfolio may be a decrease in predicted value of the stock and therefore, a preliminary risk category for the portfolio may be determined to be a high preliminary risk category.
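The mapping from an overall model response to a preliminary risk category described above may be sketched as follows. The function name is hypothetical, and the medium-risk fallback for a flat response is an assumption not stated in the text:

```python
def preliminary_risk_category(overall_model_response):
    """Map an overall model response (e.g., predicted change in value) to one
    of the three example preliminary risk categories."""
    if overall_model_response > 0:
        return "low-risk"       # predicted increase in value
    if overall_model_response < 0:
        return "high-risk"      # predicted decrease in value
    return "medium-risk"        # assumed fallback for a flat response
```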
Optionally, at operation 418, the predictive data analysis computing entity 115 includes means, such as processing element 205, communications hardware 220, or the like, for generating a real-time notification processing output based on the preliminary risk category generated for the entity. In particular, each preliminary risk category may be associated with a particular set of notification processing outputs which the predictive data analysis computing entity 115 may generate. The predictive data analysis computing entity 115 may then generate the set of notification processing outputs and provide the notification processing outputs to one or more user devices, such as a user device associated with the user, a financial institution employee, or the like and may do so in substantially real-time. The real-time notification processing output may include the predictive temporal feature impact report, including the overall model response, one or more attention header scores, one or more per-temporal feature time impact scores over each time window, one or more temporal feature sets, comparisons between one or more scores, and/or the like.
By way of continuing example, a low preliminary risk category may be associated with a set of notification processing outputs which are configured to output an explanation that a low preliminary risk category is associated with the stocks of the portfolio and further, that the value of the stocks is predicted to increase over the next 3 milliseconds. In some embodiments, the notification processing output may further be configured to execute one or more additional actions, such as buying additional stocks. As such, the notification processing output may provide the explanation that the portfolio is low risk as well as the data included in the predictive temporal feature impact report and execute one or more purchases of stocks for the customer. The purchased stock may be selected based on user configuration settings, trading history, market rates, via the use of other models, and/or the like. The notification processing output may further be generated and/or updated to include the stock that was purchased. As such, the one or more end users may receive the real-time notification processing output and may obtain an up-to-date and accurate picture of the current state of their portfolio (e.g., that the value is increasing), and the predictive data analysis computing entity 115 may further take additional actions in substantially real-time based on the up-to-date model response and preliminary risk category.
As another example, a high preliminary risk category may be associated with a set of notification processing outputs which are configured to output an explanation that a high preliminary risk category is associated with the stocks of the portfolio and further, that the value of the stocks is predicted to decrease over the next 3 milliseconds. Because a high preliminary risk category was determined, the predictive data analysis computing entity 115 may determine not to buy any additional stock. As such, the notification processing output may provide the explanation that the portfolio is high risk as well as the data included in the predictive temporal feature impact report and may also indicate that no additional stocks were purchased. As such, the one or more end users may receive the real-time notification processing output and may obtain an up-to-date and accurate picture of the current state of their portfolio (e.g., that the value is decreasing) and may be informed that no additional actions were performed due to the up-to-date model response and preliminary risk category. Additionally, the one or more end users may view the top contributing features as to why their portfolio is decreasing and thus may be better informed as to why that particular model response was determined, thereby improving model interpretability.
The flowchart blocks support combinations of means for performing the specified functions and combinations of operations for performing the specified functions. It will be understood that individual flowchart blocks, and/or combinations of flowchart blocks, can be implemented by special purpose hardware-based computing devices which perform the specified functions, or combinations of special purpose hardware and software instructions.
In some embodiments, some of the operations above may be modified or further amplified. Furthermore, in some embodiments, additional optional operations may be included. Modifications, amplifications, or additions to the operations above may be performed in any order and in any combination.
An illustrative example implementation is provided to demonstrate the results of some embodiments disclosed herein. Historical information on 50,000 customers was simulated from a financial institution for a comparable credit product. The response was binary with y equal to 1 if the account defaulted within an 18-month period and y equal to 0 otherwise. Subject matter expertise suggested that the relationship between an entity score and six of the ten candidate features should be monotone.
Two different predictive analysis machine learning models were used to fit the above data. A first predictive analysis machine learning model employed an unconstrained feedforward neural network and a second predictive analysis machine learning model employed a monotone neural network that incorporates the shape constraints shown in table 1 to generate entity scores.
The data set of 50,000 observations was divided as follows: 80% training, 10% validation, and 10% testing. The hyperparameters of the first predictive analysis machine learning model with a FFNN algorithm were tuned, ending up with three layers and nodes and a learning rate of 0.004. The second predictive analysis machine learning model with the mono-NN had three layers with nodes and a learning rate of 0.001.
Table 2 below shows the predictive performances for the two models. The second predictive analysis machine learning model with the mono-NN has a lower training area under the receiver operating characteristic curve (AUC) but higher test AUC, indicating it generalizes to the test dataset better. It also exhibited a smaller gap between training and test AUCs, suggesting it may be more robust.
It will be noted that x10 is least important. This may be due to a high correlation between x9 and x10. Similarly, x6, x7, and x8 may also be highly correlated such that the effects are distributed. To address possible interpretation problems from high levels of correlation, the 10 candidate features may be collapsed to get five candidate features that measure intrinsically different quantities. The right plot in
Another illustrative example implementation is provided to demonstrate the results of some embodiments disclosed herein. The simulated dataset described in table 1 that mimics applications for credit cards is analyzed. The candidate features and their marginal distributions are obtained from credit bureau data. Their correlations, as well as the input-output model, are simulated, but mimic real-world behavior. An adverse action explanation is provided based on the predictive contribution report generated by an example method. In this example, the determination decision threshold is set to τ=0.25. The reference feature score xA is selected as the 75th percentile of each candidate feature, shown in Table 3 below. The corresponding reference entity score is p(xA)=0.016. Two entity feature scores are selected in the declined region: x1D and x2D with p(x1D)=0.294 and p(x2D)=0.858. The per-candidate feature contribution scores may be positive or negative for monotone increasing or decreasing variables.
In the fourth column of Table 3, there is no difference in the values of x6, x7, and x8 between xA and x1D, so they do not contribute, as reflected in their corresponding per-candidate feature contribution scores. The values of x3 and x5 are not very different, so the contributions are correspondingly small. The values of x1 are quite different for the entity (declined) and reference entity (accepted) points, but the contributions are relatively small due to the lesser importance of this feature to the model. On the other hand, the values of x2 in the first and third columns are quite different, and the feature is important to the model, so the corresponding per-candidate feature contribution score is large, amounting to roughly 60%.
In some embodiments, certain highly correlated candidate features may be combined in accordance with some example embodiments described previously. For example, the candidate features in the previous example show many such large correlations, and may be combined into the set of joint candidate features depicted below in Table 4. These five joint candidate features are more interpretable in terms of measuring distinct underlying aspects of creditworthiness, and yield results that are easier to explain.
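Because Shapley-style contributions satisfy the efficiency (additivity) property, a joint contribution score for a group of correlated candidate features can be obtained by summing its members' per-candidate feature contribution scores. The grouping and numeric scores below are illustrative assumptions, not the values from Table 4.

```python
# Illustrative per-candidate feature contribution scores (assumed values)
per_feature = {
    "x1": 0.02, "x2": 0.17, "x3": 0.01, "x4": 0.04, "x5": 0.01,
    "x6": 0.00, "x7": 0.00, "x8": 0.00, "x9": 0.05, "x10": 0.00,
}

# Hypothetical grouping of ten candidate features into five joint features
groups = {
    "g1": ["x1", "x2"],
    "g2": ["x3", "x4"],
    "g3": ["x5"],
    "g4": ["x6", "x7", "x8"],
    "g5": ["x9", "x10"],
}

# Joint contribution of each group is the sum of its members' contributions,
# so the total explained difference is preserved.
joint = {g: sum(per_feature[f] for f in feats) for g, feats in groups.items()}
```

This preserves the total: the five joint contributions sum to the same value as the ten per-candidate contributions.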
As described above, the example embodiments provide systems, methods, and apparatuses which enable improved interpretability of determinations, evaluations, outcomes, or the like where machine learning is used. As such, improved clarity of the various factors that contributed to an outcome, and the respective impact each factor had on the outcome, may be discerned and provided to one or more end users to enable improved visibility, interpretability, and clarity of the machine learning model. These measures and metrics may allow end users, such as credit applicants, lenders, government regulatory body employees, etc., to learn what factors led to an application denial and the impact of each factor. Thus, credit applicants may take corrective measures to improve in these particular areas and improve the likelihood of a credit application approval on a next attempt. Additionally, the interpretability and insight provided by the predictive analysis machine learning model may conform with regulatory and/or government requirements such that the predictive analysis machine learning model may be used alone or in tandem with manual review to process credit applications.
In particular, the predictive analysis machine learning model may be configured to generate an entity score as well as a predictive contribution report based on each per-candidate feature contribution score for each candidate feature for an entity. Each candidate feature may be descriptive of a considered parameter used in the predictive analysis machine learning model and its impact on the entity score. Thus, the predictive analysis machine learning model may provide for an accurate entity score determination while also providing for interpretability of the impact of each candidate feature considered by said model.
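As one hypothetical sketch of such a configuration, the entity score and the per-candidate feature contribution scores could be assembled into a predictive contribution report; the field names, default threshold, and ranking rule below are assumptions for illustration.

```python
def predictive_contribution_report(entity_score, contributions,
                                   threshold=0.25, top_k=4):
    """Assemble an illustrative predictive contribution report.

    entity_score:  model output, where higher values indicate higher
                   likelihood of default (as in the example above)
    contributions: per-candidate feature contribution scores, keyed by
                   candidate feature name
    """
    if entity_score <= threshold:
        # Score satisfies the determination decision threshold: no adverse
        # action, so no explanation is required.
        return {"adverse_action": False, "reasons": []}
    # Rank candidate features by how much they pushed the score upward,
    # and report the top positive contributors as adverse-action reasons.
    ranked = sorted(contributions.items(), key=lambda kv: kv[1], reverse=True)
    reasons = [(name, round(c, 3)) for name, c in ranked[:top_k] if c > 0]
    return {"adverse_action": True, "reasons": reasons}

# Illustrative usage with assumed contribution values
report = predictive_contribution_report(
    0.294, {"x1": 0.02, "x2": 0.16, "x3": 0.01, "x6": 0.0})
```

In this sketch, the declined entity (score 0.294 > 0.25) would receive a report listing x2 as the leading reason, mirroring the narrative explanation of the example above.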
Various embodiments disclosed herein also address technical challenges for efficient per-candidate feature contribution score determinations in real-time by introducing techniques that enable utilizing an existing reference entity selected based on one or more dynamically customizable reference determination decision thresholds, in part to determine the per-candidate feature contribution score for each candidate feature. By using an existing reference entity, with a reference entity score which may satisfy the one or more reference determination decision thresholds, the predictive analysis machine learning model may reduce the computational complexity of runtime operations that may be associated with processing of a non-existent reference entity for use by the predictive analysis machine learning model. Additionally, the one or more dynamically customizable reference determination decision thresholds may be customized to user, institution, regulatory body, etc. specifications, thereby allowing for controllability with respect to determination of each per-candidate feature contribution score. For example, in some embodiments, it may be advantageous to select a reference determination decision threshold such that a reference entity with a reference entity score near a maximum reference entity score is selected. Alternatively, in some embodiments, it may be advantageous to select a reference determination decision threshold such that a reference entity with a reference entity score near the value of the determination decision threshold is selected. In yet another embodiment, it may be advantageous to select a reference determination decision threshold near the value of the determination decision threshold but with an additional buffer score, such that a reference entity whose reference entity score clears the determination decision threshold value by a margin (e.g., 10%-15%) is selected.
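The selection strategies above can be sketched as follows. This is a minimal illustration under assumed conventions (lower entity scores are accepted, as in the earlier example); the strategy names and buffer handling are hypothetical, not a definitive implementation.

```python
def select_reference_entity(scored_entities, decision_threshold,
                            strategy="buffer", buffer=0.10):
    """Select an existing entity to serve as the reference point.

    scored_entities:    list of (entity_id, entity_score) pairs, where
                        scores at or below the threshold are accepted
    decision_threshold: the determination decision threshold
    strategy:           'best' picks the strongest accepted score;
                        'near_threshold' picks the accepted score closest
                        to the threshold; 'buffer' picks the accepted
                        score closest to the threshold less a margin
    """
    accepted = [(e, s) for e, s in scored_entities if s <= decision_threshold]
    if not accepted:
        raise ValueError("no existing entity satisfies the decision threshold")
    if strategy == "best":
        return min(accepted, key=lambda es: es[1])
    target = (decision_threshold if strategy == "near_threshold"
              else decision_threshold * (1 - buffer))
    return min(accepted, key=lambda es: abs(es[1] - target))

# Illustrative usage with assumed entity scores and threshold 0.25
entities = [("a", 0.05), ("b", 0.20), ("c", 0.40)]
```

Selecting from an existing pool in this way avoids constructing and scoring a synthetic reference point at runtime, consistent with the efficiency rationale described above.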
Many modifications and other embodiments of the innovations set forth herein will come to mind to one skilled in the art to which these innovations pertain having the benefit of the teachings presented in the foregoing descriptions and the associated drawings. Therefore, it is to be understood that the innovations are not to be limited to the specific embodiments disclosed and that modifications and other embodiments are intended to be included within the scope of the appended claims. Moreover, although the foregoing descriptions and the associated drawings describe example embodiments in the context of certain example combinations of elements and/or functions, it should be appreciated that different combinations of elements and/or functions may be provided by alternative embodiments without departing from the scope of the appended claims. In this regard, for example, different combinations of elements and/or functions than those explicitly described above are also contemplated as may be set forth in some of the appended claims. Although specific terms are employed herein, they are used in a generic and descriptive sense only and not for purposes of limitation.
The present application claims the benefit of U.S. Provisional Application No. 63/367,701, filed Jul. 5, 2022, which is hereby incorporated by reference in its entirety.