This disclosure relates to customizing authorization request schedules and, more particularly, to customizing, with machine learning models, the number and timing of authorization requests of an authorization request schedule used to obtain, from a credential manager subsystem, authorization of a transaction between a service provider subsystem and a customer.
A payment for funding a service transaction between a customer and a service provider (e.g., for any suitable goods/services of an online media store) using a transaction credential managed by a credential manager (e.g., a payment instrument of a financial institution) may be scheduled for authorization at a particular scheduled authorization time, and the service provider may send an authorization request for the service transaction to the credential manager at that particular scheduled authorization time. However, oftentimes, such an authorization request may fail (e.g., due to downtime of the credential manager or expiration of the transaction credential) and one or more additional authorization requests may then be made.
This document describes systems, methods, and computer-readable media for customizing authorization request schedules.
As an example, a method for customizing, using a management server, an authorization request schedule afforded to a customer in a transaction for a product between a service provider and the customer using a payment credential is provided that may include: detecting, with the management server, a transaction due date for a payment in the transaction using the payment credential; identifying, with the management server, a plurality of potential request times associated with the detected transaction due date; for each potential request time of the identified plurality of potential request times, running, with the management server, a trained authorization request approval probability model on transaction feature data associated with the transaction to predict a probability of an authorization request approval of the payment for that potential request time; selecting, with the management server, a particular potential request time of the identified plurality of potential request times; calculating, with the management server, an expected gain for the selected particular potential request time using the predicted probability of an authorization request approval of the payment for the selected particular potential request time; comparing, with the management server, the calculated expected gain for the selected particular potential request time to a comparator; and determining, with the management server, whether or not to add the selected particular potential request time to the authorization request schedule based on the comparing.
As another example, a system for customizing an authorization request schedule is provided that may include a credential manager subsystem that manages a payment credential of a customer, and a service provider subsystem that offers a product to the customer, wherein the service provider subsystem may be configured to detect a due date for a payment using the payment credential in a transaction for the product between the service provider subsystem and the customer, in response to the detection of the due date, identify a plurality of potential request times associated with the due date, for each potential request time of the identified plurality of potential request times: predict, using a trained probability model, a probability of an authorization request approval of the payment if made at that potential request time, calculate an expected gain for that potential request time using the predicted probability of an authorization request approval of the payment if made at that potential request time, and selectively update the authorization request schedule with that potential request time based on the calculated expected gain for that potential request time, and, at a first potential request time of the updated authorization request schedule, request, of the credential manager subsystem, an authorization of the payment using the payment credential.
As yet another example, a non-transitory machine readable medium storing a program for execution by at least one processing unit of a management server is provided, the program for customizing an authorization request schedule, the program including sets of instructions for predicting, using a trained probability model, a probability of an authorization request approval of a payment if made at a potential request time, calculating, using a statistical model, an expected gain for the potential request time based on the predicted probability, adding the potential request time to the authorization request schedule based on the calculated expected gain, and, after the adding, at the potential request time of the authorization request schedule, attempting to obtain approval of an authorization request of the payment.
This Summary is provided only to present some example embodiments, so as to provide a basic understanding of some aspects of the subject matter described in this document. Accordingly, it will be appreciated that the features described in this Summary are only examples and should not be construed to narrow the scope or spirit of the subject matter described herein in any way. Unless otherwise stated, features described in the context of one example may be combined or used with features described in the context of one or more other examples. Other features, aspects, and advantages of the subject matter described herein will become apparent from the following Detailed Description, Figures, and Claims.
The discussion below makes reference to the following drawings, in which like reference characters refer to like parts throughout, and in which:
Systems, methods, and computer-readable media may be provided for customizing authorization request schedules. A payment for funding a service transaction between a customer and a service provider (e.g., for any suitable goods/services of an online media store) using a transaction credential managed by a credential manager (e.g., a payment instrument of a financial institution) may be initially scheduled for authorization at a particular scheduled authorization time (e.g., 30 days or 365 days after an initiation event, such as an initial subscription to a monthly or annual subscription service or any suitable installment plan or upgrade schedule or the like that may result in a payment schedule). However, an authorization request schedule may be defined in an optimized manner to guide when the service provider might attempt to request authorization from the credential manager for funding the service transaction in a customized manner based on actual features of the customer and/or the transaction and/or the credential manager and/or the like. A goal may be to customize the number of scheduled authorization requests and/or the scheduled time of each scheduled authorization request of a customized authorization request schedule afforded by the service provider to a particular customer for a particular transaction using a particular transaction credential, such that the customized schedules for all transactions may be operative to maximize the number of successful authorization requests attempted by the system for all transactions (e.g., to increase efficiency) and/or to minimize the total number of authorization requests attempted by the system for all transactions (e.g., to reduce transaction fees and/or operation traffic costs that may be associated with requesting transaction authorization). An authorization request approval probability may be identified using one or more authorization request approval probability models and a particular set of model input features based on particular transaction data associated with a particular customer and a particular transaction and a particular transaction credential, such that the scheduling of an authorization request schedule for that particular transaction may be personalized to the particular customer and/or different types of authorization request schedules may be afforded to different customers and/or different authorization request schedules may be afforded to a particular customer over time based on that particular customer's behavior. In some embodiments, certain characteristics of a scheduled authorization request schedule may be dynamically changed during the pendency of that authorization request schedule based on new real-time data that may be made available to the system. Use of one or more machine learning models may enable a data driven model for customizing an authorization request schedule as opposed to a pre-defined authorization request schedule, a dynamically refreshable active authorization request schedule as opposed to a static authorization request schedule, and/or a probability based authorization request schedule as opposed to only a rule based authorization request schedule. In some embodiments, application of one or more statistical or rule-based models may be combined with the use of one or more learning-based models to provide a hybrid model approach that may allow for the personalization and flexibility of a learning model as well as the fine tunability of a statistical model.
A transaction authorization request may be declined by CM subsystem 300 or otherwise unsuccessful for any suitable reason, such as CM subsystem 300 being temporarily offline, the customer transaction credential being temporarily suspended, the customer transaction credential being invalid (e.g., due to the expiration date of the credential having passed), and/or the like. Moreover, each transaction authorization request made by SP subsystem 200 may have any suitable cost associated therewith, including, but not limited to, an operational cost, a transactional cost, and/or the like, for which a particular monetary value may be calculated or otherwise used to represent the cost to SP subsystem 200 of making a transaction authorization request. Therefore, a goal may be to customize any suitable aspects of an authorization request schedule (e.g., the number and/or the timing of authorization requests of an authorization request schedule) afforded by the SP subsystem to a customer for minimizing the number of authorization requests and/or for maximizing the number of successful authorization requests and/or otherwise improving or optimizing or maximizing the success rate of each authorization request of the schedule (e.g., for reducing costs associated with carrying out an authorization request).
Various types of data may be used to determine the probability of success of an authorization request to be made at each one of various particular times for a particular transaction. At least one suitable authorization request approval probability (“ARAP”) model, such as any suitable machine learning model (e.g., a binary classification model, a multi-class classification model, a regression model, a random forest model, a gradient boosted tree model, an ensemble model, a neural network (e.g., a deep, wide, or deep-and-wide neural network), a learning engine, or even a model that is not a machine learning or statistical learning based model (e.g., an engine that may include policy- or operations-based rules, third party learning application programming interfaces (“APIs”), etc.), and/or the like), may be trained and utilized in conjunction with any suitable transaction data for customizing an authorization request schedule to be utilized by SP subsystem 200. For example, as described with respect to
In some embodiments, certain characteristics of an authorization request schedule may be dynamically changed during the pendency of that authorization request schedule based on new real-time data that may be made available to the system. For example, an authorization request schedule may be provided with a dynamic number and/or timing of scheduled requests by re-training and/or re-inferring any suitable ARAP model during the authorization request schedule using any new data that may be made available to the system during that authorization request schedule. As just one example, while an authorization request schedule may be active for a particular customer's transaction (e.g., as may have been determined by one or more ARAP models at a first moment in time using first transaction data for the particular customer), in response to negative results of one or more of the scheduled requests of the schedule and/or in response to the particular customer attempting to make a new purchase, such negative results and/or such a new purchase may provide new transaction data that may be used by the system for potentially dynamically updating one or more characteristics of the authorization request schedule (e.g., new BF, new TF, and/or new UF may be provided as new second transaction data for the particular customer that may be provided as new input data into one or more ARAP models for potentially defining a new or updated remainder of the current authorization request schedule (i.e., after the schedule may have been initially defined at the first moment in time)). The use of one or more ARAP models may enable such dynamic updating of one or more characteristics of an authorization request schedule that may be of a limited duration. Such models (e.g., neural networks) running on any suitable processing units (e.g., graphical processing units (“GPUs”) that may be available to SP subsystem 200) may provide significant improvements in prediction speed, efficiency, and accuracy over other types of algorithms and human-conducted analysis of data, as such models can provide estimates in a few milliseconds or less, thereby improving the functionality of any computing device on which they may be run. Due to such efficiency and accuracy, such models enable a technical solution for in-schedule dynamic updating of schedule characteristics (e.g., number and/or timing of requests) using any suitable real-time data (e.g., data made available to the models during the schedule) that may not be possible without the use of such models, as such models may increase performance of their computing device(s) by requiring less memory, providing faster response times, and/or providing increased accuracy and/or reliability. Due to the condensed time frame of an authorization request schedule and/or the time within which a decision with respect to a characteristic of an authorization request schedule ought to be made to provide a desirable customer experience, such models offer the unique ability to provide accurate determinations with the speed necessary to enable accurate decisions for initially defining a schedule and/or for dynamically adjusting an active schedule.
Such models may enable a data-driven model for customizing an authorization request schedule as opposed to a pre-defined authorization request schedule, a dynamically refreshable active authorization request schedule as opposed to a static authorization request schedule, and/or a utility probability based authorization request schedule as opposed to a purely rule-based authorization request schedule.
CM subsystem 300 may be provided by any suitable credential manager that may manage any funding account on behalf of a customer and/or that may provide a customer with any suitable customer transaction credential that may be used to identify to a service provider an associated funding account from which funds may be transferred to an account of the service provider in exchange for any suitable goods and/or services of the service provider. CM subsystem 300 may include a payment network subsystem (e.g., a payment card association or a credit card association) and/or an issuing bank subsystem and/or any other suitable type of subsystem. A specific customer transaction credential that may be used during a service transaction with SP subsystem 200 may be associated with a specific funding account of a particular user with CM subsystem 300 (e.g., accounts for various types of payment cards may include credit cards, debit cards, charge cards, stored-value cards (e.g., transit cards), fleet cards, gift cards, and the like). Such a customer transaction credential may be provisioned on device 100 (e.g., as CM credential information of an applet on a secure credential component (e.g., NFC component, secure element, and/or the like) of device 100) by CM subsystem 300 and may later be used by device 100 as at least a portion of a service transaction order communicated to SP subsystem 200 for funding a transaction between a customer and SP subsystem 200 (e.g., to pay for a good or service of the SP of SP subsystem 200). Alternatively, such a customer transaction credential may be provided to a customer as a physical credential card or any suitable credential information that may be relayed by the customer to an SP (e.g., over the telephone or via manual entry into a web form), which may be used by the SP for funding a service transaction.
SP subsystem 200 may be provided by any suitable service provider that may utilize customer transaction credential data to facilitate a service transaction for providing any suitable goods and/or services to a customer or another entity or device of the customer's choosing. As just one example, SP subsystem 200 may be provided by Apple Inc. of Cupertino, Calif., which may be a provider of various services to users of device 100 (e.g., the iTunes™ Store for selling/renting media to be played by device 100, the Apple App Store™ for selling/renting applications for use on device 100, the Apple Music™ Service for providing a subscription streaming service, the Apple iCloud™ Service for storing data from device 100 and/or associating multiple user devices and/or multiple user profiles with one another, the Apple Online Store for buying various Apple products online, etc.), and which may also be a provider, manufacturer, and/or developer of device 100 itself (e.g., when device 100 is an iPod™, iPad™, iPhone™, or the like) and/or of an operating system (e.g., device application 103) of device 100. As another example, SP subsystem 200 may be provided by a restaurant or a movie theater or an airline or a car dealership or any other suitable SP entity. The SP that may provide SP subsystem 200 (e.g., Apple Inc.) may be distinct and independent from any CM of any remote CM subsystem 300 (e.g., any funding entity of any remote funding subsystem, any financial institution entity of any remote financial institution subsystem, etc.). For example, the SP that may provide SP subsystem 200 may be distinct and/or independent from any payment network or issuing bank that may furnish and/or manage any credit card or any other customer transaction credential and/or customer payment credential and/or customer funding credential to be used by a customer for funding a service transaction with SP subsystem 200.
Communication of any suitable data between electronic device 100 and CM subsystem 300 may be enabled via any suitable communications set-up 95, which may include any suitable wired communications path, wireless communications path, or combination of two or more wired and/or wireless communications paths using any suitable communications protocol(s) and/or any suitable network and/or cloud architecture(s). Additionally or alternatively, communication of any suitable data between SP subsystem 200 and CM subsystem 300 may be enabled via any suitable communications set-up 95. Additionally or alternatively, communication of any suitable data between electronic device 100 and SP subsystem 200 that may not be made via CM subsystem 300 may be enabled via any suitable communications set-up 95. Communications set-up 95 may be at least partially managed by one or more trusted service managers (“TSMs”). Any suitable circuitry, device, system, or combination of these (e.g., a wired and/or wireless communications infrastructure that may include one or more communications towers, telecommunications servers, or the like) that may be operative to create a communications network may be used to provide at least a portion of communications set-up 95, which may be capable of providing communications using any suitable wired or wireless communications protocol. For example, communications set-up 95 may support Wi-Fi (e.g., an 802.11 protocol), ZigBee (e.g., an 802.15.4 protocol), WiDi™, Ethernet, Bluetooth™, BLE, high frequency systems (e.g., 900 MHz, 2.4 GHz, and 5.6 GHz communication systems), infrared, TCP/IP, SCTP, DHCP, HTTP, BitTorrent™, FTP, RTP, RTSP, RTCP, RAOP, RDTP, UDP, SSH, WDS-bridging, any communications protocol that may be used by wireless and cellular telephones and personal e-mail devices (e.g., GSM, GSM plus EDGE, CDMA, OFDMA, HSPA, multi-band, etc.), any communications protocol that may be used by a low power Wireless Personal Area Network (“6LoWPAN”) module, any other communications protocol, or any combination thereof.
As shown in
As shown in
Memory 104 may include one or more storage mediums, including, for example, a hard-drive, flash memory, permanent memory such as read-only memory (“ROM”), semi-permanent memory such as random access memory (“RAM”), any other suitable type of storage component, or any combination thereof. Memory 104 may include cache memory, which may be one or more different types of memory used for temporarily storing data for electronic device applications. Memory 104 may be fixedly embedded within electronic device 100 or may be incorporated onto one or more suitable types of cards that may be repeatedly inserted into and removed from electronic device 100 (e.g., a subscriber identity module (“SIM”) card or secure digital (“SD”) memory card). Memory 104 may store media data (e.g., music and image files), software (e.g., for implementing functions on device 100), firmware, media information (e.g., media content and/or associated metadata), preference information (e.g., media playback preferences), lifestyle information (e.g., food preferences), exercise information (e.g., information obtained by exercise monitoring equipment or any suitable sensor circuitry), transaction information (e.g., information such as credit card information or other transaction credential information), wireless connection information (e.g., information that may enable device 100 to establish a wireless connection), subscription information (e.g., information that keeps track of podcasts or television shows or other media a user subscribes to), contact information (e.g., telephone numbers and e-mail addresses), calendar information, pass information (e.g., transportation boarding passes, event tickets, coupons, store cards or other transaction credentials (e.g., financial payment cards), etc.), any suitable model data of device 100 (e.g., as may be stored in any suitable device model 105 of memory assembly 104 (e.g., any portion or all of one, some, or each model of SP subsystem 200 or otherwise as may be described herein)), any other suitable data, or any combination thereof.
Power supply circuitry 106 can include any suitable circuitry for receiving and/or generating power, and for providing such power to one or more of the other components of electronic device 100. For example, power supply circuitry 106 can be coupled to a power grid (e.g., when device 100 is not acting as a portable device or when a battery of the device is being charged at an electrical outlet with power generated by an electrical power plant). As another example, power supply circuitry 106 can be configured to generate power from a natural source (e.g., solar power using solar cells). As another example, power supply circuitry 106 can include one or more batteries for providing power (e.g., when device 100 is acting as a portable device). For example, power supply circuitry 106 can include one or more of a battery (e.g., a gel, nickel metal hydride, nickel cadmium, nickel hydrogen, lead acid, or lithium-ion battery), an uninterruptible or continuous power supply (“UPS” or “CPS”), and circuitry for processing power received from a power generation source (e.g., power generated by an electrical power plant and delivered to the user via an electrical socket or otherwise).
One or more input components 108 may be provided to permit a user to interact or interface with device 100. For example, input component circuitry 108 can take a variety of forms, including, but not limited to, a touch pad, dial, click wheel, scroll wheel, touch screen, one or more buttons (e.g., a keyboard), mouse, joy stick, track ball, microphone, still image camera, video camera, scanner (e.g., a bar code scanner or any other suitable scanner that may obtain product identifying information from a code, such as a bar code, or the like), proximity sensor, light detector, biometric sensor (e.g., a fingerprint reader or other feature recognition sensor, which may operate in conjunction with a feature-processing application that may be accessible to electronic device 100 for authenticating a user), line-in connector for data and/or power, and combinations thereof. Each input component 108 can be configured to provide one or more dedicated control functions for making selections or issuing commands associated with operating device 100.
Electronic device 100 may also include one or more output components 110 that may present information (e.g., graphical, audible, and/or tactile information) to a user of device 100. For example, output component circuitry 110 of electronic device 100 may take various forms, including, but not limited to, audio speakers, headphones, line-out connectors for data and/or power, visual displays, infrared ports, tactile/haptic outputs (e.g., rumblers, vibrators, etc.), and combinations thereof. As a particular example, electronic device 100 may include a display output component as output component 110, where such a display output component may include any suitable type of display or interface for presenting visual data to a user. A display output component may include a display embedded in device 100 or coupled to device 100 (e.g., a removable display). A display output component may include display driver circuitry, circuitry for driving display drivers, or both, and such a display output component can be operative to display content (e.g., media playback information, application screens for applications implemented on electronic device 100, information regarding ongoing communications operations, information regarding incoming communications requests, device operation screens, etc.) that may be under the direction of processor 102.
It should be noted that one or more input components and one or more output components may sometimes be referred to collectively herein as an input/output (“I/O”) component or I/O circuitry or I/O interface (e.g., input component 108 and output component 110 as I/O component or I/O interface 109). For example, input component 108 and output component 110 may sometimes be a single I/O component 109, such as a touch screen, that may receive input information through a user's touch (e.g., multi-touch) of a display screen and that may also provide visual information to a user via that same display screen.
Sensor circuitry 112 may include any suitable sensor or any suitable combination of sensors operative to detect movements of electronic device 100 and/or any other characteristics of device 100 or its environment (e.g., physical activity or other characteristics of a user of device 100). For example, sensor circuitry 112 may include any suitable sensor(s), including, but not limited to, one or more of a GPS sensor, accelerometer, directional sensor (e.g., compass), gyroscope, motion sensor, pedometer, passive infrared sensor, ultrasonic sensor, microwave sensor, a tomographic motion detector, a camera, a biometric sensor, a light sensor, a timer, or the like. In some examples, a biometric sensor may include, but is not limited to, one or more health-related optical sensors, capacitive sensors, thermal sensors, electric field (“eField”) sensors, and/or ultrasound sensors, such as photoplethysmogram (“PPG”) sensors, electrocardiography (“ECG”) sensors, galvanic skin response (“GSR”) sensors, posture sensors, stress sensors, and/or the like. While specific examples are provided, it should be appreciated that other sensors can be used and other combinations of sensors can be combined into a single device. In some examples, a GPS sensor or any other suitable location detection component(s) of device 100 can be used to determine a user's location and movement, as well as a displacement of the user's motion.
Communications circuitry 114 may be provided to allow device 100 to communicate with one or more other electronic devices or servers using any suitable communications protocol (e.g., with CM subsystem 300 and/or with SP subsystem 200 using communications set-up 95). For example, communications circuitry 114 may support Wi-Fi™ (e.g., an 802.11 protocol), ZigBee™ (e.g., an 802.15.4 protocol), WiDi™, Ethernet, Bluetooth™, Bluetooth™ Low Energy (“BLE”), high frequency systems (e.g., 900 MHz, 2.4 GHz, and 5.6 GHz communication systems), infrared, transmission control protocol/internet protocol (“TCP/IP”) (e.g., any of the protocols used in each of the TCP/IP layers), Stream Control Transmission Protocol (“SCTP”), Dynamic Host Configuration Protocol (“DHCP”), hypertext transfer protocol (“HTTP”), BitTorrent™, file transfer protocol (“FTP”), real-time transport protocol (“RTP”), real-time streaming protocol (“RTSP”), real-time control protocol (“RTCP”), Remote Audio Output Protocol (“RAOP”), Real Data Transport Protocol™ (“RDTP”), User Datagram Protocol (“UDP”), secure shell protocol (“SSH”), wireless distribution system (“WDS”) bridging, any communications protocol that may be used by wireless and cellular telephones and personal e-mail devices (e.g., Global System for Mobile Communications (“GSM”), GSM plus Enhanced Data rates for GSM Evolution (“EDGE”), Code Division Multiple Access (“CDMA”), Orthogonal Frequency-Division Multiple Access (“OFDMA”), high speed packet access (“HSPA”), multi-band, etc.), any communications protocol that may be used by a low power Wireless Personal Area Network (“6LoWPAN”) module, Near Field Communication (“NFC”), any other communications protocol, or any combination thereof. Communications circuitry 114 may also include or be electrically coupled to any suitable transceiver circuitry that can enable device 100 to be communicatively coupled to another device (e.g., a host computer or an accessory device) and communicate with that other device wirelessly, or via a wired connection (e.g., using a connector port). Communications circuitry 114 may be configured to determine a geographical position of electronic device 100. For example, communications circuitry 114 may utilize the global positioning system (“GPS”) or a regional or site-wide positioning system that may use cell tower positioning technology or Wi-Fi™ technology.
Processing circuitry 102 of electronic device 100 may include any processing circuitry that may be operative to control the operations and performance of one or more components of electronic device 100. For example, processor 102 may receive input signals from any input component 108 and/or sensor circuitry 112 and/or communications circuitry 114 and/or drive output signals through any output component 110 and/or communications circuitry 114. As shown in
Although not shown, device 100 may include any suitable secure credential component (e.g., NFC component, secure element, and/or the like) that may include or otherwise be configured to provide a tamper-resistant platform (e.g., as a single-chip or multiple-chip secure microcontroller) that may be capable of securely hosting applications and their confidential and cryptographic data in accordance with rules and security requirements that may be set forth by a set of well-identified trusted authorities (e.g., an authority of SP subsystem 200 and/or of CM subsystem 300 and/or of an industry standard, such as GlobalPlatform). Any suitable customer transaction credential information, such as CM credential information, may be stored in an applet on such a secure credential component of device 100 and may be configured to provide customer transaction credential data for use in any suitable service transaction order with a remote entity subsystem, such as SP subsystem 200. For example, the customer transaction credential data may provide an actual value source and/or may provide sufficient detail for identifying a funding account of CM subsystem 300 that may be used as a value source, and the value source may be used to at least partially fund a service transaction between electronic device 100 and SP subsystem 200 for any suitable service provider service (e.g., any suitable good or service that may be provided on behalf of SP subsystem 200 for the benefit of a user of electronic device 100).
Electronic device 100 may also be provided with a housing 101 that may at least partially enclose one or more of the components of device 100 for protection from debris and other degrading forces external to device 100. In some embodiments, one or more of the components may be provided within its own housing (e.g., input component 108 may be an independent keyboard or mouse within its own housing that may wirelessly or through a wire communicate with processor 102, which may be provided within its own housing).
Although not shown, SP subsystem 200 may also include a processor component that may be the same as or similar to processor component 102 of electronic device 100, a communications component that may be the same as or similar to communications component 114 of electronic device 100, an I/O interface that may be the same as or similar to I/O interface 109 of electronic device 100, a bus that may be the same as or similar to bus 115 of electronic device 100, a memory component that may be the same as or similar to memory 104 of electronic device 100, and/or a power supply component that may be the same as or similar to power supply 106 of electronic device 100.
Although not shown, CM subsystem 300 may also include a processor component that may be the same as or similar to processor component 102 of electronic device 100, a communications component that may be the same as or similar to communications component 114 of electronic device 100, an I/O interface that may be the same as or similar to I/O interface 109 of electronic device 100, a bus that may be the same as or similar to bus 115 of electronic device 100, a memory component that may be the same as or similar to memory 104 of electronic device 100, and/or a power supply component that may be the same as or similar to power supply 106 of electronic device 100.
As shown in
An output component 110a may be a display that can be used to display a visual or graphic user interface (“GUI”) 180, which may allow a user to interact with electronic device 100. A screen 190 of GUI 180 may include various layers, windows, screens, templates, elements, menus, and/or other components of a currently running application (e.g., application 103) that may be displayed in all or some of the areas of display output component 110a. One or more of user input components 108a-108i may be used to navigate through GUI 180. For example, one user input component 108 may include a scroll wheel that may allow a user to select one or more graphical elements or icons 182 of GUI 180. Icons 182 may also be selected via a touch screen I/O component 109a that may include display output component 110a and an associated touch input component 108f. Such a touch screen I/O component 109a may employ any suitable type of touch screen input technology, such as, but not limited to, resistive, capacitive, infrared, surface acoustic wave, electromagnetic, or near field imaging. Furthermore, touch screen I/O component 109a may employ single point or multi-point (e.g., multi-touch) input sensing.
Icons 182 may represent various applications, layers, windows, screens, templates, elements, and/or other components that may be displayed in some or all of the areas of display component 110a upon selection by the user. Furthermore, selection of a specific icon 182 may lead to a hierarchical navigation process. For example, selection of a specific icon 182 may lead from screen 190 of
Electronic device 100 also may include various other I/O components 109 that may allow for communication between device 100 and other devices, such as a connection port 109b that may be configured for transmitting and receiving data files, such as media files or customer order files, and/or any suitable information (e.g., audio signals) from a remote data source and/or power from an external power source. For example, I/O component 109b may be any suitable port (e.g., a Lightning™ connector or a 30-pin dock connector available by Apple Inc.). I/O component 109c may be a connection slot for receiving a SIM card or any other type of removable component. Electronic device 100 may also include at least one audio input component 109g, such as a microphone, and at least one audio output component 110b, such as an audio speaker. Electronic device 100 may also include at least one tactile output component 110c (e.g., a rumbler, vibrator, haptic and/or taptic component, etc.), a camera and/or scanner input component 108h (e.g., a video or still camera, and/or a bar code scanner or any other suitable scanner that may obtain product identifying information from a code, such as a bar code, or the like), and a biometric input component 108i (e.g., a fingerprint reader or other feature recognition sensor, which may operate in conjunction with a feature-processing application that may be accessible to electronic device 100 for authenticating a user).
Referring now to
SMP broker component 240 of SP subsystem 200 may be configured to manage customer authentication with an SP customer account of SP subsystem 200 and/or to manage CM validation with a CM subsystem account of SP subsystem 200. SMP broker component 240 may be a primary end point that may control certain interface elements (e.g., elements of a GUI 180) on device 100. A CM application of CM subsystem 300 may be configured to call specific application programming interfaces (“APIs”) and SMP broker 240 may be configured to process requests of those APIs and respond with data that may derive a portion of a user interface that may be presented by CM subsystem 300 (e.g., to device 100) and/or respond with application protocol data units (“APDUs”) that may communicate with CM subsystem 300. Such APDUs may be received by SP subsystem 200 from CM subsystem 300 via a trusted services manager (“TSM”) of system 1 (e.g., a TSM of a communication path between SP subsystem 200 and CM subsystem 300). In some particular embodiments, SMP TSM component 250 of SP subsystem 200 may be configured to provide Global Platform-based services or any other suitable services that may be used to carry out credential provisioning operations on device 100 from CM subsystem 300. GlobalPlatform, or any other suitable secure channel protocol, may enable SMP TSM component 250 to properly communicate and/or provision sensitive account data between a secure element of device 100 and a TSM for secure data communication between SP subsystem 200 and a remote subsystem.
SMP TSM component 250 may be configured to use HSM component 290 to protect keys and generate new keys. SMP crypto services component 260 of SP subsystem 200 may be configured to provide key management and cryptography operations that may be provided for user authentication and/or confidential data transmission between various components of system 1 (e.g., between SP subsystem 200 and CM subsystem 300 and/or between SP subsystem 200 and device 100 and/or between different components of SP subsystem 200). SMP crypto services component 260 may utilize HSM component 290 for secure key storage and/or opaque cryptographic operations. A payment crypto service of SMP crypto services component 260 may be configured to interact with IDMS component 270 to retrieve information associated with on-file credit cards or other types of customer transaction credentials associated with user accounts of the SP (e.g., an Apple iCloud™ account). Such a payment crypto service may be configured to be the only component of SP subsystem 200 that may have clear text (e.g., non-hashed) information describing customer transaction credentials (e.g., credit card numbers) of its user accounts in memory. Fraud system component 280 of SP subsystem 200 may be configured to run an SP fraud check on a customer transaction credential based on data known to the SP about the transaction credential and/or the customer (e.g., based on data (e.g., customer transaction credential information) associated with a customer account with the SP and/or any other suitable data that may be under the control of the SP and/or any other suitable data that may not be under the control of a remote subsystem). Fraud system component 280 may be configured to determine an SP fraud score for the credential based on various factors or thresholds. Additionally or alternatively, SP subsystem 200 may include store 265, which may be a provider of various services to users of device 100 (e.g., the iTunes™ Store for selling/renting media to be played by device 100, the Apple App Store™ for selling/renting applications for use on device 100, the Apple iCloud™ Service for storing data from device 100 and/or associating multiple user devices and/or multiple user profiles with one another, the Apple Online Store for buying various Apple products online, the Apple Music™ Service for enabling subscriptions of various streaming services, etc.). As just one example, store 265 may be configured to manage and provide an application 103 and/or application 103a to device 100, where the application may be any suitable application, such as a CM application (e.g., a banking application), an SP application (e.g., a music streaming service application), an e-mail application, a text messaging application, an internet application, a credential management application, or any other suitable communication application, and/or to provide any suitable SP product to a customer (e.g., a media file to memory 104 of customer device 100, etc.).
Server 210 may be any suitable server that may be operative to handle any suitable services or functionalities of SP subsystem 200. In other embodiments, at least a portion or the entirety of server 210 may be an independent subsystem distinct from SP subsystem 200 (e.g., as a third party subsystem of a third party that may be distinct from the SP of SP subsystem 200 or as another subsystem provided by the SP of SP subsystem 200 that may be distinct from SP subsystem 200). As shown in
Some or all models generated or otherwise trained or built by model builder 218 may be collected by a model repository 232, while a best model identifier 234 may be operative to identify the best performing model(s) of model repository 232 using any suitable techniques (e.g., model identifier 234 may identify a best performing model for each one of the various types of models available to system 1 at a particular moment (e.g., a best performing ARAP model, etc.), each of which may be the same or different type of machine learning model). Then, when one or more particular types of model is to be used to customize an authorization request schedule for a particular customer for a particular purchase transaction, a model request server 236 may be operative to receive from any suitable source (e.g., any suitable client source for server 210 (e.g., store 265)) a request 239 for such model(s) that may include model request data 239d (e.g., any suitable transaction data associated with a particular customer and/or a particular transaction of that customer, including, but not limited to, any suitable UF data, BF data, TF data, a “dsid” (e.g., a unique customer identifier for the particular customer and/or a customer score indicative of some trustworthiness or fraud score or otherwise that may be associated with the customer), a “store_front_id” (e.g., an identifier of the particular store that received the transaction purchase attempt from the customer (e.g., a particular app store, a particular music subscription service, etc.)), a “buy_amount” (e.g., a value amount of the SP product(s) to be purchased in the transaction purchase attempt), “buys_last_x_hours” (e.g., number of purchases attempted or authorized for the particular customer during the last X hours), “card_type” (e.g., type of transaction credential being used for the transaction purchase attempt), “flow_step” (e.g., relative operation within a customization flow at which request 239 was generated (e.g., which operation (e.g., operation 608) and/or the like of process 600 of
In response to receiving such a request 239, model request server 236 may be operative to use some or all of model request data 239d as input data to one or more of the models available to model request server 236 (e.g., one or more of the best ARAP models of best model identifier 234) in order to receive appropriate model output data 205d (e.g., any suitable model output data that may be used to customize an authorization request schedule experience for the particular customer, including, but not limited to, an authorization request approval probability (“ARAP”) score for each one of any suitable number of potential request times (e.g., a value between 0 (e.g., indicative of a 0% chance of success of the authorization request if made at the particular potential request time) and 1 (e.g., indicative of a 100% chance of success of the authorization request if made at the particular potential request time)) for the particular customer's transaction as may be determined by a best performing ARAP model available to model request server 236 using at least some transaction data of model request data 239d as model input data (e.g., as may be used by operation 608 of process 600)), and/or the like).
Any or all of such model output data 205d that may be received by model request server 236 for a particular request 239 may be provided as at least a portion of any suitable model response data 237d for a model response 237 that may be returned by server 210 to any suitable target (e.g., the client source (e.g., store 265) that may have provided request 239). As shown, in addition to any suitable model output data 205d, model response data 237d may include any suitable additional model response data that may help facilitate customization and/or use of an authorization request schedule for the particular customer's transaction, including, but not limited to, a “dsid” (e.g., the unique customer identifier for the particular customer and/or a customer score indicative of some trustworthiness or fraud score or otherwise that may be associated with the customer (e.g., the same identifier as in request 239, which may facilitate linking request 239 and response 237)), a “timestamp” (e.g., any suitable timestamp indicative of the time at which all or any suitable portion of response data 237d may have been generated and/or the potential time with which a particular ARAP score is associated), a “model_version” (e.g., any suitable data that may be indicative of a particular model of server 210 that may have been used to generate at least a portion of data 205d and/or of a type of such a model and/or the like, which may be used for diagnostic purposes or any other suitable purposes, where such optional data may be exposed to the client and may be useful, for example, when the client would like to determine how and/or when the model output evolved, where an A/B test or experiment or otherwise might be conducted based on such data by the client or otherwise), and/or the like.
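For illustration only, the following Python sketch shows one possible shape of such a request 239/response 237 exchange. The field names (e.g., "dsid," "store_front_id," "buy_amount," "buys_last_x_hours," "card_type," "flow_step," "model_version," "timestamp") are taken from the examples above, while the dictionary layout, the helper name handle_model_request, and the stub scoring function are hypothetical and not part of any particular implementation.

```python
from datetime import datetime, timezone

def score_request_time(request_data, request_time):
    """Stub for a trained ARAP model; a real model would use the transaction
    features in request_data to predict a probability for request_time."""
    return 0.5  # placeholder probability between 0 and 1

def handle_model_request(request_data, potential_request_times):
    """Build a model response 237-style payload for a request 239-style payload."""
    arap_scores = {
        t.isoformat(): score_request_time(request_data, t)
        for t in potential_request_times
    }
    return {
        "dsid": request_data["dsid"],            # echoed customer identifier (links 239 and 237)
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "model_version": "arap-best-model",      # hypothetical version label
        "arap_scores": arap_scores,              # one probability per potential request time
    }

# Example request data, using the feature names described above (values are illustrative).
request_data = {
    "dsid": "customer-123",
    "store_front_id": "music-subscription-store",
    "buy_amount": 9.99,
    "buys_last_x_hours": 2,
    "card_type": "credit",
    "flow_step": "operation_608",
}
times = [datetime(2024, 1, d, 9, 0, tzinfo=timezone.utc) for d in (1, 2, 3)]
print(handle_model_request(request_data, times))
```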
As also shown in
Therefore, streaming job model builder 228 may be operative to update one or more batch job models using only particular feature types of transaction data 225 from data 221 of queue database 222, where such updating by builder 228 may be accomplished with limited overhead and/or limited processing in a more efficient manner (e.g., as only a limited set of features of a limited set of transaction data may be used to retrain one or more models). Thus, modules 220, 222, 224, 226, 228, and/or 230 may be operative to update and/or improve the training of one or more models based on real-time or near-real time data in an efficient manner (e.g., by focusing on only certain features of transaction data that may be of significant importance to the effectiveness of the model(s) and/or by only using newly generated transaction data). This may enable a certain type of model, such as an ARAP model, to be re-trained or otherwise refreshed using transaction data generated or otherwise first made available to server 210 (e.g., via data 221) during an active authorization request schedule for which that refreshed model may then be used to make a determination on an updated characteristic (e.g., an updated probability) for that active authorization request schedule (e.g., at operation 608 of process 600 (e.g., periodically and/or in response to any new relevant transaction data being received for the particular customer)).
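As a minimal sketch of this feature-filtered streaming approach, and assuming streamed transaction records arrive as dictionaries, the following hypothetical helper retains only a configurable subset of feature types (the names in STREAM_FEATURES are placeholders) and groups the filtered records into small batches suitable for incrementally retraining a model with limited overhead.

```python
# Feature types (assumed names) retained for retraining; all other fields of
# each streamed transaction record are dropped, keeping the update lightweight.
STREAM_FEATURES = ("card_type", "buy_amount", "buys_last_x_hours", "authorized")

def collect_retraining_batches(streamed_records, batch_size=1000):
    """Filter newly streamed transaction records down to the selected feature
    types and yield small batches for incremental model updates."""
    batch = []
    for record in streamed_records:
        batch.append({k: record[k] for k in STREAM_FEATURES if k in record})
        if len(batch) >= batch_size:
            yield batch
            batch = []
    if batch:
        yield batch
```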
Any suitable API(s) may be used between any two communicating entities of system 1. Store 265 may call an API endpoint with a request 239 to retrieve a response, and the API response to the call may be a response 237 from server 210. Additionally or alternatively, server 210 may call an API endpoint with a request for any suitable data 211 and/or data 332 from any suitable data source, and the API response to the call may be any suitable transaction data 211 and/or 221 or otherwise from any suitable data source.
Thus, one or more models 205 managed by server 210 may be operative to effectively and efficiently determine an appropriate authorization request schedule for a particular customer in a particular transaction situation. For example, such a model or learning engine may include any suitable neural network (e.g., an artificial neural network) that may be initially configured, trained on one or more sets of scored (e.g., authorized and/or rejected) transaction data for one or more past transactions (e.g., individual and/or aggregated transactions by any customers or particular customer(s)), and then used to predict an appropriate characteristic or eligibility or authorization request approval probability of a customized authorization request schedule experience for a particular customer in a particular transaction situation.
A neural network or neuronal network or artificial neural network or any other suitable type of model for use in managing one or more models may be hardware-based, software-based, or any combination thereof, such as any suitable model (e.g., an analytical model, a computational model, etc.), which, in some embodiments, may include one or more sets or matrices of weights (e.g., adaptive weights, which may be numerical parameters that may be tuned by one or more learning algorithms or training methods or other suitable processes) and/or may be capable of approximating one or more functions (e.g., non-linear functions or transfer functions) of its inputs. The weights may be connection strengths between neurons of the network, which may be activated during training and/or prediction. A neural network may generally be a system of interconnected neurons that can compute values from inputs and/or that may be capable of machine learning and/or pattern recognition (e.g., due to an adaptive nature). A neural network may use any suitable machine learning techniques to optimize a training process. The neural network may be used to estimate or approximate functions that can depend on a large number of inputs and that may be generally unknown. The neural network may generally be a system of interconnected “neurons” that may exchange messages between each other, where the connections may have numeric weights (e.g., initially configured with initial weight values) that can be tuned based on experience, making the neural network adaptive to inputs and capable of learning (e.g., learning pattern recognition). A suitable optimization or training process may be operative to modify a set of initially configured weights assigned to the output of one, some, or all neurons from the input(s) and/or hidden layer(s). A non-linear transfer function may be used to couple any two portions of any two layers of neurons, including an input layer, one or more hidden layers, and an output (e.g., an input to a hidden layer, a hidden layer to an output, etc.).
Different input neurons of the neural network may be associated with respective different types of features or categories of transaction data and may be activated by transaction feature data of the respective transaction feature (e.g., each possible category or feature of BF transaction data, each possible category or feature of UF transaction data, each possible category or feature of TF transaction data, each possible category or feature of graph based transaction data, and/or the like may be associated with one or more particular respective input neurons of the neural network and transaction feature data for the particular transaction feature may be operative to activate the associated input neuron(s)). The weight assigned to the output of each neuron may be initially configured using any suitable determinations that may be made by a custodian or processor of the model (e.g., server 210) based on the data available to that custodian.
The initial configuring of the learning engine or model (e.g., the initial weighting and arranging of neurons of a neural network of the learning engine) may be done using any suitable data accessible to a custodian of the model. For example, a model custodian may be operative to capture any suitable initial background data about a particular customer or all known customers or a subset of all known customers or all known transactions or a subset of all known transactions as well as the result or truth for each transaction (e.g., authorized or rejected) in any suitable manner from any suitable sources (e.g., SP subsystem 200, one or more CM subsystems 300, one or more customer devices 100, one or more third party subsystems, or the like). Any suitable training methods or algorithms (e.g., learning algorithms) may be used to train the neural network of the learning engine, including, but not limited to, Back Propagation, Resilient Propagation, Genetic Algorithms, Simulated Annealing, Levenberg, Nelder-Meade, and/or the like. Such training methods may be used individually and/or in different combinations to get the best performance from a neural network. A loop (e.g., a receipt and train loop) of receiving transaction feature data and a result/truth for a transaction and then training the model using the received transaction feature data and result/truth may be repeated any suitable number of times for more effectively training the learning engine, where the received transaction feature data and the received result/truth received of different receipt and train loops may be for different customers or for the same customer (e.g., for different transactions) and/or may be received from the same source or from different sources. The number and/or type(s) of the one or more types of transaction features for which transaction feature data may be received for one receipt and train loop may be the same or different in any way(s) than the number and/or type(s) of the one or more transaction features for which transaction feature data may be received for a second receipt and train loop.
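The following sketch illustrates one possible receipt-and-train loop under stated assumptions: each iteration receives an encoded transaction feature vector together with its result/truth (1 for an authorized past request, 0 for a rejected one) and nudges the weights of a minimal logistic model toward that truth. The feature layout, learning rate, and helper names are illustrative only; a production ARAP model could equally be a gradient boosted tree, random forest, or deeper neural network as described above.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def receipt_and_train(examples, n_features, learning_rate=0.05):
    """examples: iterable of (feature_vector, truth) pairs, where truth is
    1 for an authorized past request and 0 for a rejected one."""
    weights = np.zeros(n_features)  # initially configured weights
    bias = 0.0
    for features, truth in examples:
        x = np.asarray(features, dtype=float)
        prediction = sigmoid(weights @ x + bias)   # forward pass (non-linear transfer function)
        error = prediction - truth                 # gradient of the log loss w.r.t. the logit
        weights -= learning_rate * error * x       # tune weights toward the received truth
        bias -= learning_rate * error
    return weights, bias

# Hypothetical encoded features: [buy_amount, buys_last_x_hours, is_credit_card].
history = [
    ([9.99, 2, 1], 1),   # authorized
    ([49.99, 0, 0], 0),  # rejected
    ([9.99, 1, 1], 1),   # authorized
]
weights, bias = receipt_and_train(history, n_features=3)
print(weights, bias)
```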
At operation 608, an ARAP model may be utilized to predict an ARAP (e.g., an ARAP score) for each potential request time for the particular transaction (e.g., using any suitable ARAPM 601m of server 210 (e.g., as trained at operation 601) using any suitable request 239 and response 237 communications between store 265 and server 210 (e.g., using any suitable transaction feature data that may be specific to the particular customer and/or the particular transaction for which a schedule is being customized (e.g., “dsid” and/or “store_front_id” and/or “buy_amount” and/or “buys_last_x_hours” and/or “card_type” and/or “flow_step” and/or “model_version” and/or any suitable BF and/or any suitable TF and/or any suitable UF and/or the like))). For example, at operation 608, various suitable model input features of the particular transaction and for a particular potential request time may be provided as inputs to a previously trained ARAP model, where such model input features may include one or more various types of UF data, BF data, and/or TF data associated with the particular transaction and a particular potential request time t, and the ARAP model may be operative to provide an output probability P(t) based on those model input features, where such output probability P may be any suitable value between 0 and 1 for that particular transaction at that particular potential request time (e.g., the model may be operative to compute an output probability P(t) of a value 0 that may be indicative of a prediction of a 0% probability of success for an authorization request made at the particular potential request time t, an output probability P(t) of a value 1 that may be indicative of a prediction of a 100% probability of success for an authorization request made at the particular potential request time t, or an output probability P(t) of any value between 0 and 1 that may be indicative of a prediction of any other suitable percentage probability of success for an authorization request made at the particular potential request time t). Operation 608 may be carried out to generate such an output probability P(t) for every potential request time t identified at operation 606.
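As a hedged illustration of operation 608, the following sketch returns an output probability P(t) for each potential request time t identified at operation 606, where predict_probability stands in for any suitable trained ARAP model; the stub model and the feature names used in the example are hypothetical.

```python
def predict_arap_scores(transaction_features, potential_request_times, predict_probability):
    """Return a mapping from each potential request time t to its output
    probability P(t), where predict_probability stands in for any trained
    ARAP model used at operation 608."""
    return {
        t: predict_probability({**transaction_features, "request_time": t})
        for t in potential_request_times
    }

# Stub model: pretend approval becomes slightly more likely at later times.
def stub_model(features):
    return min(1.0, 0.3 + 0.1 * features["request_time"])

features = {"dsid": "customer-123", "buy_amount": 9.99, "card_type": "credit"}
print(predict_arap_scores(features, potential_request_times=[0, 1, 2, 3],
                          predict_probability=stub_model))
```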
At operations 610-620, process 600 may be operative to use a statistical model (e.g., statistical model SM 610m) that may be operative to calculate an expected gain for attempting an authorization request at each potential request time t using the output probability P(t) predicted at operation 608 for every potential request time t identified at operation 606 in conjunction with one or more cost constraints. For example, an expected gain E to be realized by the SP subsystem if an authorization request were to be made at a particular potential request time t (i.e., E(t)) may be defined to equal P(t)*A−F+(1−P(t))*E(t+1), where A may be the amount to be paid to the SP subsystem if the authorization request were successful (e.g., the purchase price for the transaction), where F may be the overhead cost (e.g., transaction cost and/or operation cost) incurred by the SP subsystem for making an authorization request, and where E(t+1) may be the expected gain calculated for the particular request time t+1 that may be consecutively after the particular request time t. Therefore, the expected gain E(t) for a particular time t may be defined to be dependent on a future expected gain with respect to that particular time t (e.g., dependent on the expected gain at time t+1 (i.e., E(t+1))). This may be why the expected gain calculation may first be made for the last time tT and then calculated in reverse order by time (e.g., an initial potential request time t that may be selected for use at an initial iteration of operation 610 may be time tT, and then the next iteration of operation 610 (e.g., after operation 619) may select time tT−1). For example, if A is $10.00 and F is $1.00, and if P(t) is predicted to be 0.4 by the ARAP model and E(t+1) is determined to be $5.00, then the expected gain E(t) may be determined by the statistical model to be $6.00. While the ARAP model may be trained over a long period of time using a large amount of historical data based on the success or failure of many past authorization requests, the statistical model's use of various cost variables (e.g., A, F, or any other suitable cost variables) and/or the values of those cost variables (e.g., $10.00, $1.00, etc.) may be updated at any suitable moment (e.g., as costs associated with authorization requests and values of various transactions may change in many ways at any suitable time). Therefore, the combination of two distinct models, such as an ARAP model and a statistical model, for determining an expected gain at a particular potential request time may provide a hybrid model solution that enables the customization of an ARAP model with the ability to easily fine tune and adjust characteristics of a statistical model. In some embodiments, the expected gain peaks throughout the potential request times of the grace period may be identified, and only the peaks may be added to the customized authorization request schedule (e.g., at various iterations of operation 616), where an expected gain peak may be identified at a potential request time t where E(t) (e.g., as determined at operation 610) is not less than (e.g., is greater than or equal to) each one of E(t+1) and E(t−1) (e.g., as determined at operations 612 and 614).
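The expected gain recurrence above may be expressed directly as a small helper, shown here as an illustrative sketch using the example values from the text ($10.00 purchase price, $1.00 per-request cost).

```python
def expected_gain(p_t, amount_a, fee_f, e_next):
    """E(t) = P(t)*A - F + (1 - P(t))*E(t+1): success pays A, every attempt
    costs F, and an unsuccessful attempt falls back to the gain E(t+1)."""
    return p_t * amount_a - fee_f + (1.0 - p_t) * e_next

# Example from the text: A = $10.00, F = $1.00, P(t) = 0.4, E(t+1) = $5.00
# gives E(t) = 0.4*10.00 - 1.00 + 0.6*5.00 = $6.00.
assert abs(expected_gain(0.4, 10.00, 1.00, 5.00) - 6.00) < 1e-9
```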
In some embodiments, after each one of E(t) and E(t+1) has been calculated (e.g., at operations 610 and 612, respectively, for a selected potential request time t), a determination may be made (e.g., at operation 614) as to which of the following two expected values is greater: (i) E(t) or (ii) E(t+1). If (i) is determined to be greater than (ii), then process 600 may proceed from operation 614 to operation 616, where potential request time t may be added to an initial customized authorization request schedule before proceeding to operation 620; otherwise, if (i) is determined not to be greater than (ii), then process 600 may proceed from operation 614 to operation 618, where potential request time t may not be added to the initial customized authorization request schedule before proceeding to operation 620. This may enable process 600 to add a particular selected potential request time t to the initial customized authorization request schedule only when the expected gain E(t+1) for trying at the consecutively later potential request time t+1 is less than the combined expected gain P(t)*A−F+(1−P(t))*E(t+1) for trying at the particular selected potential request time t and then trying at the consecutively later potential request time t+1 only if the attempt at time t was unsuccessful (e.g., try at selected time t only if the expected gain when trying at time t+1 alone is less than the combined expected gain of trying at time t and trying at time t+1 only if the attempt at time t was unsuccessful).
As just one example of such a comparison, where a currently selected potential request time t that is request time 120 out of 240 potential request times is determined to have an ARAP [P(t120)] of 0.4 (e.g., at operation 608), where the consecutively next potential request time 121 out of 240 potential request times is determined to have an expected gain E(t+1) of $5.00 (e.g., at a previous iteration of operation 610), where an expected gain [E(t120)] may be defined by process 600 to be calculated as E(t)=P(t)*A−F+(1−P(t))*E(t+1), where A is defined to be $10.00, and where F is defined to be $1.00, such that E(t120) may be determined to be $6.00 (e.g., at a current iteration of operation 610) and such that E(t121) may be determined to be $5.00 (e.g., at a current iteration of operation 612 and/or a previous iteration of operation 610), then process 600 may be operative to determine (e.g., at operation 614) that (ii) E(t121) of $5.00 is less than (i) E(t120) of $6.00 [0.4*$10.00−$1.00+(1−0.4)*$5.00] (i.e., that an expected gain of $5.00 for only trying at t121 is less than an expected gain of $6.00 for trying at t120 and then at t121 only when the attempt at t120 is unsuccessful), such that process 600 may advance from operation 614 to operation 620 via operation 616 rather than via operation 618, such that selected potential request time t120 is added to the initial customized authorization request schedule. As just one other example, when process 600 may advance to operation 620 after determining that E(t121) is less than E(t120) when particular potential request time t120 was selected, as described above, operation 620 may determine that not every potential request time t has been selected and may advance to operation 619, where a next (e.g., consecutively earlier) potential request time t119 may be selected before returning to another iteration of operations 610, 612, and 614 during which t is t119 and t+1 is t120. In such another iteration of operations 610, 612, and 614, where the selected potential request time t that is request time 119 out of 240 potential request times is determined to have an ARAP [P(t119)] of 0.1 (e.g., at operation 608), where the consecutively next potential request time 120 out of 240 potential request times is determined to have an expected gain E(t+1) of $6.00 (e.g., at a previous iteration of operation 610, as described above), where an expected gain [E(t119)] may be defined by process 600 to be calculated as E(t)=P(t)*A−F+(1−P(t))*E(t+1), where A is defined to be $10.00, and where F is defined to be $1.00, such that E(t119) may be determined to be $5.40 (e.g., at a current iteration of operation 610) and such that E(t120) may be determined to be $6.00 (e.g., at a current iteration of operation 612 and/or a previous iteration of operation 610), then process 600 may be operative to determine (e.g., at operation 614) that (ii) E(t120) of $6.00 is greater than (i) E(t119) of $5.40 [0.1*$10.00−$1.00+(1−0.1)*$6.00] (i.e., that an expected gain of $6.00 for only trying at t120 is greater than an expected gain of $5.40 for trying at t119 and then at t120 only when the attempt at t119 is unsuccessful), such that process 600 may advance from operation 614 to operation 620 via operation 618 rather than via operation 616, such that selected potential request time t119 is not added to the initial customized authorization request schedule.
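For clarity, the two worked iterations above may be traced with the expected_gain helper sketched earlier; the dollar values are the illustrative values from the examples.

```python
# Trace of the two example iterations (A = $10.00, F = $1.00).
e_t121 = 5.00                                      # from a prior iteration of operation 610
e_t120 = expected_gain(0.4, 10.00, 1.00, e_t121)   # 0.4*10 - 1 + 0.6*5 = 6.00
add_t120 = e_t120 > e_t121                         # True: t120 is added (operation 616)

e_t119 = expected_gain(0.1, 10.00, 1.00, e_t120)   # 0.1*10 - 1 + 0.9*6 = 5.40
add_t119 = e_t119 > e_t120                         # False: t119 is not added (operation 618)
```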
The first overall iteration of operation 610 may be operative to select the final potential request time tT (e.g., time t240), and then each iteration of operation 619 may be operative to select the potential request time that is consecutively prior to the previously selected potential request time (e.g., time t239, then time t238, . . . , then time t2, and then time t1), for example, such that process 600 may work backwards through the set of all potential request times t1-tT as may be identified at operation 606. However, in such embodiments, process 600 may be operative to uniquely handle operation 612 during the first iteration of operations 610-620, as it may not be possible to routinely determine E(t+1) when the currently selected potential request time t is the final potential request time tT, whereby process 600 may be operative at such an initial iteration of operation 612 to calculate such an E(tT+1) to be equal to a churn cost −C, where churn cost −C may be the amount to be lost by the SP subsystem if it were unable to successfully receive authorization to fund the particular customer transaction (e.g., if the authorization request made by the SP subsystem at each request time of the customized authorization request schedule were to be unsuccessful, such that the customer may churn). As just one example, where A may be $10.00 and F may be $1.00 and C may be $60.00 (e.g., as may be defined in any suitable manner (e.g., by the SP subsystem)), if P(tT) is predicted to be 0.9 by the ARAP model, then the expected gain E(tT+1) may be determined by the statistical model (e.g., at operation 612) to be −$60.00 and E(tT) may be determined (e.g., at operation 610) to be $2.00, such that process 600 may be operative to determine (e.g., at operation 614) that (ii) E(tT+1) of −$60.00 is less than (i) E(tT) of $2.00 [0.9*$10.00−$1.00+(1−0.9)*−$60.00] (i.e., that an expected gain of −$60.00 for only trying at tT+1 is less than an expected gain of $2.00 for trying at tT and then at tT+1 only when the attempt at tT is unsuccessful), such that process 600 may advance from operation 614 to operation 620 via operation 616 rather than via operation 618, such that selected potential request time tT is added to the initial customized authorization request schedule.
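A minimal sketch of the full backward pass of operations 610-620 is given below, assuming the recurrence described above and the comparator of operation 614 (add time t only when E(t)>E(t+1)); the default cost values mirror the examples above, and propagating E(t) regardless of whether t was added follows the worked examples in the text.

```python
# Sketch of the backward pass (operations 610-620): start with E(tT+1) = -C
# (the churn cost), walk the potential request times in reverse order, and add
# a time t to the initial schedule only when E(t) > E(t+1).
def build_initial_schedule(times_in_order, arap, amount_a=10.00, fee_f=1.00, churn_c=60.00):
    """times_in_order: potential request times t1..tT in chronological order;
    arap: mapping from each time t to its predicted approval probability P(t)."""
    schedule = []
    e_next = -churn_c                      # E(tT+1): the gain if every request fails
    for t in reversed(times_in_order):     # operation 619 selects earlier times each pass
        e_t = arap[t] * amount_a - fee_f + (1.0 - arap[t]) * e_next
        if e_t > e_next:                   # operation 614: compare E(t) with E(t+1)
            schedule.append(t)             # operation 616: add t to the initial schedule
        e_next = e_t                       # E(t) serves as E(t+1) for the next iteration
    return sorted(schedule)
```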
It is to be understood that the manner in which an expected gain for a particular time t may be determined based on an ARAP P(t) (e.g., P(t)*A−F+(1−P(t))*E(t+1)) may be any suitable calculation or determination using any suitable equation or function of P(t) and/or using any suitable variable cost elements, such as A, F, C, and/or the like of any suitable values, which may be changed over time as the SP subsystem may see fit to best determine an expected gain based on changing operation costs and/or transaction costs and/or churn costs and/or the like, using any suitable statistical model(s) at operations 610 and/or 612 that may be completely independent from how an ARAP model may be configured and/or trained and/or utilized for determining an ARAP value for a particular time t, such that a flexible and tunable hybrid model solution may be realized. Additionally or alternatively, it is to be understood that the manner in which a calculated expected gain E(t) for a particular selected potential request time t may be compared (e.g., at operation 614) to any suitable comparator, such as to an expected gain for a consecutive other potential request time (e.g., E(t−1) and/or E(t+1)) and/or to any suitable threshold, in order to determine selectively whether or not that particular selected potential request time t ought to be added to an initial customized authorization request schedule, may be any suitable calculation or determination using any suitable equation or function or comparison (e.g., (A) add t to the schedule if E(t−1)<E(t)>E(t+1) (e.g., try at selected time t when E(t) is a peak of some sort), (B) add t to the schedule if E(t)>E(t+1) (e.g., try at selected time t if the expected gain when trying at time t+1 is less than the expected gain of trying at time t, where E(t) may be calculated as P(t)*A−F+(1−P(t))*E(t+1), such that the current expected gain E(t) may be dependent on the future expected gain E(t+1), which may be why an expected gain calculation may be made from the last time tT and then calculated in reverse order by time), (C) add t to the schedule if E(t)>any suitable threshold (e.g., a fixed threshold or any suitable threshold calculated in any suitable manner based on all the calculated E(t)'s of all potential request times), and/or the like).
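Comparators (A) and (C) may likewise be sketched as small helpers; the peak test below mirrors the peak definition given earlier, and the default threshold value is purely illustrative.

```python
# Sketches of alternative comparators for deciding whether time t enters the schedule.
def is_expected_gain_peak(e_prev, e_t, e_next):
    """(A): keep t when E(t) is not less than each of E(t-1) and E(t+1)."""
    return e_t >= e_prev and e_t >= e_next

def exceeds_threshold(e_t, threshold=0.0):
    """(C): keep t when E(t) clears a fixed or derived threshold."""
    return e_t > threshold
```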
Once an initial authorization request schedule has been defined based on each one of the T iterations of operations 610, 612, 614, 616 or 618, 619, and 620 (e.g., based on considering the expected gain of each one of the potential request times), process 600 may advance from operation 620 to operation 625, where any suitable constraints or filters may be applied to the initial authorization request schedule. For example, one or more constraints or filters may be applied at operation 625 for reducing the number of request times of the authorization request schedule by eliminating some of the request times that may define the initial authorization request schedule. For example, operation 625 may be operative to limit the number of request times in an authorization request schedule to no more than N request times, where the value of N may be determined in any suitable manner, such as by an equation where N<((A*R)/F), where A may be the amount to be paid to the SP subsystem if any authorization request of the schedule were successful (e.g., the purchase price for the transaction), where F may be the overhead cost (e.g., transaction cost and/or operation cost) incurred by the SP subsystem for making an authorization request, and where R may be a fee ratio desired by the SP subsystem. As just one example, if the initial authorization request schedule defined prior to operation 625 (e.g., during a schedule computation subprocess 691 of process 600) includes 8 request times of the 240 potential request times identified at operation 606, and if A may be $10.00 and F may be $1.00 and R may be 0.5, then N may be required to be less than 5 (e.g., less than $10.00*0.5/$1.00), such that N may be 4 and only 4 of the 8 request times of the initial authorization request schedule may be maintained (e.g., after operation 625). Operation 625 may be operative to remove the appropriate number of request times in any suitable manner, such as by removing the request times associated with the lowest expected gains. It is to be understood that the manner in which any suitable number of request times may be removed in any suitable manner from an initial customized authorization request schedule may be determined using any suitable calculation(s) or determination(s) using any suitable variable cost elements, such as A, F, C, R, and/or the like of any suitable values, which may be changed over time as the SP subsystem may see fit to best determine any suitable constraints or filters (e.g., business constraints) based on changing operation costs and/or transaction costs and/or churn costs and/or SP goals and/or the like, using any suitable strategic (e.g., business) model(s) at operation 625 that may be completely independent from how an ARAP model may be configured and/or trained and/or utilized for determining an ARAP value for a particular time t, and/or that may be completely independent from how any potential request times are selectively added to an initial customized authorization request schedule, such that a flexible and tunable hybrid model solution may be realized.
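One illustrative way to apply the operation-625 cap N<((A*R)/F) is sketched below, keeping the N request times with the highest expected gains; the values of A, F, and R default to the illustrative values from the example.

```python
import math

# Sketch of the operation-625 filter: keep at most N request times, where
# N < (A * R) / F, retaining the times with the highest expected gains.
def filter_schedule(schedule, expected_gains, amount_a=10.00, fee_f=1.00, ratio_r=0.5):
    limit = math.ceil(amount_a * ratio_r / fee_f) - 1   # largest integer strictly below A*R/F
    if limit < 0:
        return []
    ranked = sorted(schedule, key=lambda t: expected_gains[t], reverse=True)
    return sorted(ranked[:limit])
```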
Once a customized authorization request schedule has been initially defined (e.g., at subprocess 691) and optionally filtered (e.g., at operation 625), the customized authorization request schedule for a particular customer's transaction may be executed at a schedule execution subprocess 699. For example, operation 620 may proceed (e.g., via operation 625) to an operation 630, at which process 600 may determine whether the current time matches a scheduled time of the customized authorization request schedule. If not, operation 630 may be repeated until a match is determined. Once a match is determined, process 600 may proceed from operation 630 to an operation 632, at which an authorization request for the particular customer's transaction may be made (e.g., an authorization request may be communicated between SP subsystem 200 and an appropriate CM subsystem 300). If that authorization request of operation 632 is determined to have been successful at operation 634, then process 600 may proceed to operation 636, at which the scheduled payment of the particular customer's transaction may be deemed complete and the remainder (if any) of the customized authorization request schedule may be ignored. Alternatively, if that authorization request of operation 632 is determined to have been unsuccessful at operation 634, then process 600 may return to operation 630, at which process 600 may again determine whether the current time matches a scheduled time of the customized authorization request schedule (e.g., the next scheduled time of the customized authorization request schedule).
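A minimal sketch of schedule execution subprocess 699 follows; request_authorization is a hypothetical callable standing in for the authorization request communicated between SP subsystem 200 and the appropriate CM subsystem 300, and the polling interval is an illustrative assumption.

```python
import time

# Sketch of schedule execution subprocess 699 (operations 630-636): wait for
# each scheduled time, attempt authorization, and stop at the first success.
def execute_schedule(scheduled_times, request_authorization, now=time.time, poll_seconds=60):
    for scheduled_time in sorted(scheduled_times):
        while now() < scheduled_time:        # operation 630: wait for a time match
            time.sleep(poll_seconds)
        if request_authorization():          # operation 632 then operation 634
            return True                      # operation 636: payment complete
    return False                             # every scheduled request was unsuccessful
```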
It is understood that the operations shown in process 600 of
One, some, or all of the processes described with respect to
It is understood that any, each, or at least one module or component or subsystem of system 1 may be provided as a software construct, firmware construct, one or more hardware components, or a combination thereof. For example, any, each, or at least one module or component or subsystem of system 1 may be described in the general context of computer-executable instructions, such as program modules, that may be executed by one or more computers or other devices. Generally, a program module may include one or more routines, programs, objects, components, and/or data structures that may perform one or more particular tasks or that may implement one or more particular abstract data types. It is also to be understood that the number, configuration, functionality, and interconnection of the modules and components and subsystems of system 1 are only illustrative, and that the number, configuration, functionality, and interconnection of existing modules, components, and/or subsystems may be modified or omitted, additional modules, components, and/or subsystems may be added, and the interconnection of certain modules, components, and/or subsystems may be altered.
At least a portion of one or more of the modules or components or subsystems of system 1 may be stored in or otherwise accessible to an entity of system 1 in any suitable manner (e.g., in memory 104 of device 100 (e.g., as at least a portion of an application 103 and/or as at least a portion of an application 103a)). Any or all of the modules or other components of system 1 may be mounted on an expansion card, mounted directly on a system motherboard, or integrated into a system chipset component (e.g., into a “north bridge” chip).
Any or each module or component of system 1 may be a dedicated system implemented using one or more expansion cards adapted for various bus standards. For example, all of the modules may be mounted on different interconnected expansion cards or all of the modules may be mounted on one expansion card. Any or each module or component of system 1 may include its own processing circuitry and/or memory. Alternatively, any or each module or component of system 1 may share processing circuitry and/or memory with any other module.
As described above, one aspect of the present technology is the gathering and use of data available from various sources to customize authorization request schedules. The present disclosure contemplates that in some instances, this gathered data may include personal information data that uniquely identifies or can be used to contact or locate a specific person. Such personal information data can include demographic data, location-based data, telephone numbers, email addresses, social network identifiers, home addresses, office addresses, data or records relating to a user's health or level of fitness (e.g., vital signs measurements, medication information, exercise information, etc.) or purchase history, date of birth, or any other identifying or personal information.
The present disclosure recognizes that the use of such personal information data, in the present technology, can be used to the benefit of users. The present disclosure contemplates that the entities responsible for the collection, analysis, disclosure, transfer, storage, or other use of such personal information data will comply with well-established privacy policies and/or privacy practices. In particular, such entities should implement and consistently use privacy policies and practices that are generally recognized as meeting or exceeding industry or governmental requirements for maintaining personal information data private and secure. Such policies should be easily accessible by users, and should be updated as the collection and/or use of data changes. Personal information from users should be collected for legitimate and reasonable uses of the entity and not shared or sold outside of those legitimate uses. Further, such collection/sharing should occur after receiving the informed consent of the users. Additionally, such entities should consider taking any needed steps for safeguarding and securing access to such personal information data and ensuring that others with access to the personal information data adhere to their privacy policies and procedures. Further, such entities can subject themselves to evaluation by third parties to certify their adherence to widely accepted privacy policies and practices. In addition, policies and practices should be adapted for the particular types of personal information data being collected and/or accessed and adapted to applicable laws and standards, including jurisdiction-specific considerations. For instance, in the United States, collection of or access to certain health data may be governed by federal and/or state laws, such as the Health Insurance Portability and Accountability Act (“HIPAA”); whereas health data in other countries may be subject to other regulations and policies and should be handled accordingly. Hence different privacy practices should be maintained for different personal data types in each country.
Despite the foregoing, the present disclosure also contemplates embodiments in which users selectively block the use of, or access to, personal information data. That is, the present disclosure contemplates that hardware and/or software elements can be provided to prevent or block access to such personal information data. For example, in the case of location detection services, the present technology can be configured to allow users to select to “opt in” or “opt out” of participation in the collection of personal information data during registration for services or anytime thereafter. In addition to providing “opt in” or “opt out” options, the present disclosure contemplates providing notifications relating to the access or use of personal information. For instance, a user may be notified upon downloading an app that their personal information data will be accessed and then reminded again just before personal information data is accessed by the app.
Moreover, it is the intent of the present disclosure that personal information data should be managed and handled in a way to minimize risks of unintentional or unauthorized access or use. Risk can be minimized by limiting the collection of data and deleting data once it is no longer needed. In addition, and when applicable, including in certain health related applications, data de-identification can be used to protect a user's privacy. De-identification may be facilitated, when appropriate, by removing specific identifiers (e.g., date of birth, etc.), controlling the amount or specificity of data stored (e.g., collecting location data at a city level rather than at an address level), controlling how data is stored (e.g., aggregating data across users), and/or other methods.
Therefore, although the present disclosure broadly covers use of personal information data to implement one or more various disclosed embodiments, the present disclosure also contemplates that the various embodiments can also be implemented without the need for accessing such personal information data. That is, the various embodiments of the present technology are not rendered inoperable due to the lack of all or a portion of such personal information data. For example, customizing authorization request schedules can be made based on non-personal information data or a bare minimum amount of personal information, such as the content being requested by the device associated with a user, other non-personal information available to the device, or publicly available information.
While there have been described systems, methods, and computer-readable media for customizing authorization request schedules, it is to be understood that many changes may be made therein without departing from the spirit and scope of the subject matter described herein in any way. Insubstantial changes from the claimed subject matter as viewed by a person with ordinary skill in the art, now known or later devised, are expressly contemplated as being equivalently within the scope of the claims. Therefore, obvious substitutions now or later known to one with ordinary skill in the art are defined to be within the scope of the defined elements.
Therefore, those skilled in the art will appreciate that the invention can be practiced by other than the described embodiments, which are presented for purposes of illustration rather than of limitation.
This application claims the benefit of prior filed U.S. Provisional Patent Application No. 62/689,654, filed Jun. 25, 2018, which is hereby incorporated by reference herein in its entirety.