The subject matter disclosed herein generally relates to entity authentication and identifying abnormalities to measure transaction risk for use on a user device.
Digital transactions of a variety of types may stem not from a party authorized to enter into the transaction but from parties that are either unauthorized to enter into the transaction or bad actors and network bots who have acquired the means to enter into the transaction illegally from a hostile environment. The hostile environment may result from a Denial of Service (DOS) attack from sources such as User Datagram Protocol (UDP) flooding, Internet Control Message Protocol (ICMP) flooding, and/or portscanning. For instance, a stolen credit card number or bank account access may be utilized to make fraudulent purchases or exchanges. A stolen or compromised password may be utilized to improperly access information. Even conventional purchases or activities within an organization may be engaged in by an employee or member who does not have authorization to do so. In these cases, certain purchases or transactions may be initiated by one party, such as a regular employee, but require authorization by a second party, such as a manager.
Aspects of the disclosure include a system for using user entity behavior based information for providing and restricting access to a secure computer network comprising: a processor coupled to a network interface, the processor configured to: capture contextual factors of a user entity interacting with a mobile device, wherein the contextual factors include user entity behavior, device characteristics, browser characteristics, and network traffic; receive a transaction request from the mobile device; calculate a transaction risk and confidence score for the transaction request based on the contextual factors; and compare the transaction risk and confidence score to a predetermined threshold risk score to determine whether the transaction request is approved. In some embodiments, a context aware risk based approach is used. Aspects of the disclosure further include a system for using user entity context and behavior information for providing and restricting access to a secure computer network comprising: a plurality of processors coupled to a network interface, the processors configured to: capture behavioral and contextual factors of a user entity interacting with a client device used to access cloud services, wherein the behavioral and contextual factors include the user entity behavior and habits, client device characteristics, client device browser characteristics, and network traffic; calculate a transaction risk and confidence score of the user entity, the client device, and the client device browser for a transaction request based on the user entity behavior and habits and the client device characteristics; and compare the transaction risk and confidence score to a predetermined threshold risk score to determine whether the transaction request is automatically approved and, if not approved, send an out of band authentication request to the user entity device to authenticate and authorize access.
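By way of a non-limiting illustration of the summarized flow, the following sketch assumes hypothetical names and values (e.g., ContextualFactors, transaction_risk, and THRESHOLD_RISK_SCORE with its value of 0.30 are illustrative assumptions, not the claimed implementation). It shows how a transaction request might be automatically approved when its combined risk score satisfies a predetermined threshold, and otherwise routed to out of band authentication:

```python
# Minimal sketch of the summarized decision flow; names, weights, and the
# threshold value are illustrative assumptions, not the claimed implementation.
from dataclasses import dataclass

@dataclass
class ContextualFactors:
    behavior_score: float   # 0.0 (abnormal) .. 1.0 (matches user habits)
    device_score: float     # confidence that the device/browser fingerprint is known
    network_score: float    # confidence that the network traffic looks normal

THRESHOLD_RISK_SCORE = 0.30  # predetermined threshold (assumed value)

def transaction_risk(factors: ContextualFactors) -> float:
    """Combine contextual factors into a single risk score in [0, 1]."""
    confidence = (factors.behavior_score +
                  factors.device_score +
                  factors.network_score) / 3.0
    return 1.0 - confidence  # low confidence implies high risk

def handle_transaction(factors: ContextualFactors) -> str:
    risk = transaction_risk(factors)
    if risk <= THRESHOLD_RISK_SCORE:
        return "approved"                     # auto-approve a low-risk request
    return "out-of-band authentication"       # step-up for the user entity

if __name__ == "__main__":
    print(handle_transaction(ContextualFactors(0.9, 0.95, 0.8)))   # approved
    print(handle_transaction(ContextualFactors(0.2, 0.4, 0.5)))    # step-up
```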
Some embodiments are illustrated by way of example and not limitation in the figures of the accompanying drawings.
Several illustrative embodiments will now be described with respect to the accompanying drawings, which form a part hereof. While particular embodiments, in which one or more aspects of the disclosure may be implemented, are described below, other embodiments may be used and various modifications may be made without departing from the scope of the disclosure or the spirit of the appended claims.
Below are example definitions that are provided for illustrative purposes only and should not be construed to limit the scope of the embodiments disclosed herein in any manner.
Allocentric: in the context of a transaction, the other users, devices, applications, and transactions within the overall system in which the access or transaction of interest is observed; these are not necessarily bound to the actual transaction but are concurrent transactions present in the system. Good examples are observations of traffic in a system independent of the transaction initiated by the actor of interest, that is, other actors impacting the system load and hence indirectly impacting the current transaction of interest, such as Transmission Control Protocol (TCP) synchronize (SYN), Internet Control Message Protocol (ICMP) and User Datagram Protocol (UDP) flooding, portscanning, payload signature of the system, number of transactions, common IPs, and abnormal versus normal behaviors of transactions other than the current subject and context of the transaction of interest.
Active Session: a user entity is validly logged into an identity provider such as a Relying Party (RP) Services Application.
Attributes: unique identification information associated with a user entity (defined below) such as biometric information, spatial, location and/or behavior, device, browser and network context.
Authentication Assurance: the degree of confidence reached in the authentication process that the communication partner (human or machine) is the user entity that it claims to be or is expected to be. The confidence may be based on the degree of confidence in the binding between the communicating entity and the user identity that is presented.
Egocentric: in the context of both physical and cyber transactions, the relation of the user, device, or application of interest to the overall system objects and transactions, viewed from that entity's own perspective (in contrast to the allocentric view).
Significant Events: a defined normal (or abnormal) event of interest, defined by the policy engine or through the artificial intelligence/machine learning (AIML) cognitive engine, that can trigger a condition of interest that demands a change in the LOA required (dynamic level of assurance) in real time, initiating a need to authenticate, authorize, audit, or even deny service where appropriate.
SIEM: security information and event management, which aggregates the security information management and security event management functions into one system to collect relevant data from multiple sources, identify deviations from the defined norms, and provide an early warning or even take appropriate action as needed to inform enterprise infosec and IT experts of a possible threat, either during or after the event.
Spatiotemporal Velocity: user transaction, access and login inference based on time and location and scoring based on proximity, distance of travel and time feasibility.
Contextual Identifiers (or Contextual Factors): may be part of the verification process and may include the following multifactors used singularly or in different combinations: location, biometrics (e.g., heartbeat monitoring, iris recognition, fingerprint, voice analysis, deoxyribonucleic acid (DNA) testing), user entity habits, user entity location, spatial, body embedded devices, smart tattoos, dashboard of user's car, user's television (TV), user's home security digital fingerprint, Domain Name System (DNS), device, browser and network context, remote access Virtual Private Network (VPN), Application usage and habits, data sharing and access fingerprint and the like.
Credentials: may take several forms, including but not limited to: (a) personally identifiable information such as name, address, birthdate, etc.; (b) an identity proxy such as a username, login identifier (user name), or email address; (c) some biometric identifiers such as fingerprint, voice, face, etc.; (d) an X.509 digital certificate; (e) a digital fingerprint and approval from a user-binded device; (f) behavioral habits of the device or of the user of the device in physical or cyber space; and/or (g) behavior of the network and applications at the time of user interface with the application and network.
Fingerprinting: collection of Attributes (including actual biometric and device, browser, habits of user and machine on the network and associated fingerprints) that help identify the authentic user and device and system entity.
Identity Assurance: the degree of confidence in the process of identity validation and verification used to establish the identity of the user entity to which the credential was issued, and the degree of confidence that the entity using the credential is the entity to which the credential was issued or assigned.
Level of Assurance (LOA): a level of confidence for identity proofing for the binding between level of access for an entity and the presented identity information.
Real Time: the time associated with authorization periods described herein which range depending on the type of transaction, need and urgency for authorization. The authorization time periods may vary from under 10 seconds to 24 hours or more. Real time authorization as used herein prevents fraud at its inception versus mitigating it in a post event notification. In one implementation, real time may refer to the time for the transaction to complete.
Relying Party (RP): could be an entity in multiple sectors requiring secure interactions such as financial institutions, healthcare, retailers, education institutions, government agencies and associated social services, social networks, websites, and the like. A Relying Party will typically use a server (e.g., the RP Server) as a manifestation of its intentions.
Relying Party (RP) Services: can be any transaction including authorized login such as Web or on-premise log-in; Virtual Private Network (VPN) log-in; transaction monitoring; financial transaction for online or a point of sale (e.g., dollar amount, type of transaction including check versus wire versus cashier check); a workflow for approving; viewing or modifying data on a server; access to confidential versus restricted data, and/or physical access control to a secure facility or secure space. RP Services may typically be any web or on-premises service requiring approval for access with dynamic different levels of assurance within. RP Services can be an application (i.e., Relying Party (RP) Services Application) residing on a secure system/client device; be part of an RP Server; and/or be located at a separate server. In addition, an RP Service may be an application executing on a secure system/client device and connected to the RP Server(s) and/or located at a separate server, wherein the RP Server(s) and/or separate server provides the data and executables for providing the service through the application.
Risk Score: a risk score for a threat shall be determined by combining device data, user data and mobile/wearable device data. Various vectors/fingerprint data from the user are combined and converted to a risk score (see the illustrative sketch following these definitions).
Threshold Risk Score: a score set by the system and method 100 to determine whether an entity seeking access is a threat.
User Entity: throughout this disclosure consumer, user, user entity, entity, machine entity, user agent, client, client agent, subscriber, requesting agent and requesting party shall mean the same and may be human or machine.
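To make the Risk Score, Threshold Risk Score, and Spatiotemporal Velocity definitions above concrete, the following non-limiting sketch combines assumed per-vector scores (device, user, and mobile/wearable) into a single risk score and flags a spatiotemporally infeasible pair of accesses; the weights, the 900 km/h feasibility limit, and the function names are illustrative assumptions rather than prescribed values:

```python
# Illustrative only: combines assumed per-vector anomaly scores into one risk
# score and checks spatiotemporal feasibility between two access events.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometers."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def spatiotemporal_feasible(prev, curr, max_kmh=900.0):
    """True if the travel implied by two (lat, lon, epoch_seconds) events is feasible."""
    km = haversine_km(prev[0], prev[1], curr[0], curr[1])
    hours = max((curr[2] - prev[2]) / 3600.0, 1e-6)
    return km / hours <= max_kmh

def risk_score(device, user, wearable, feasible):
    """Weighted combination of per-vector anomaly scores in [0, 1]."""
    score = 0.4 * device + 0.4 * user + 0.2 * wearable
    if not feasible:                 # impossible travel dominates the score
        score = max(score, 0.9)
    return score

if __name__ == "__main__":
    prev = (45.52, -122.68, 0)           # Portland at t = 0
    curr = (48.85, 2.35, 3600)           # Paris one hour later: infeasible
    ok = spatiotemporal_feasible(prev, curr)
    print(risk_score(device=0.1, user=0.2, wearable=0.1, feasible=ok))
```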
The abnormalities detection system and method 100 disclosed herein enables Relying Parties and/or operators of Risk and Analytics Engines 108 to build a predictive analytics engine delivering a dynamic level of assurance technique. Level of assurance (LOA) is a level of confidence for identity proofing for the binding between level of access for an entity and the presented identity information. Dynamic LOA is discussed in U.S. patent application Ser. No. 14/672,098, filed Mar. 18, 2015, having the title “Dynamic Authorization With Adaptive Levels of Assurance”, also assigned to Applicant, which is hereby incorporated by reference. The system and method 100 disclosed herein reduces difficulties in gaining access to secure systems and/or facilities (i.e., “friction”) for legitimate user entities, elevates trust in platform exchange, and injects preventive flows and measures when encountering potential bad/threat actors. The mobile component of the system and method 100 delivers an end-to-end solution feeding an analytics and early warning system. By connecting device, browser and user habits to the mobile contextual data (e.g., ItsMe™), it is possible to model the user entity's normal behavior and detect abnormalities.
The secure system/client device 106 may be another device upon which the user entity 102 is operating. The secure system/client device 106 may be any suitable electronic, computational, and/or communication device for conducting transactions, such as a desktop computer, cash register, kiosk, order terminal, electronic lock, automobile lock, and/or any location or device on which a user entity 102 may seek access to a secure system/client device 106, an electronic location, a secure facility, user information, or other location or item having restricted access. As such, while such a secure system/client device 106 may have a user input device, such as a keyboard or keypad, the user input terminal would not necessarily have the capacity to display messages to a user entity 102 attempting to enter into a transaction. In certain embodiments, the user entity 102 may be an employee, for example, of a government agency, a pharmaceutical or health provider company, a financial institution, or an enterprise with privileged access to highly sensitive assets or data, or a client of any of the above. In such instances, the user entities are pre-authorized and trusted with certain access permissions and credentials, such as a username and password, to access the network or services.
Risk and Analytics Engine (or server) 108 may be operated by or for the benefit of an enterprise which may be any party that may offer a service or control access to a secure facility, secure system/client device 106 or something for which attempts to engage by a user entity 102 may need to be authorized or authenticated. (An exemplary operator of the Engine 108 may be Acceptto™ Corporation through its eGuardian™ service).
Backend Relying Party Server (or servers) 110 may provide a number of different account services, capabilities and information associated with an account or user entity 102. In one embodiment, the RP Server 110 may be owned or operated by a financial institution or any other entity that maintains sensitive data. User device 104 and secure system/client device 106 may have an RP Services Application 105 with various RP Services offered by such a Relying Party which are displayed on the user entity device 104 or secure system/client device 106 after login. The RP Services Application may be provided as a native application, an application running as a cloud service, a web browser interface or another suitable interface on devices 104 and 106. Each RP Service may be associated with a different level of assurance (LOA) of the identity of the user entity 102 accessing the RP Service. For example, an RP Service such as viewing an account balance has the lowest LOA associated with it, and may be accessible as soon as the user entity 102 enters the login identification (ID) and password for a secure system and the information is verified by the Risk and Analytics Engine 108 and RP Server(s) 110. In contrast, other functions may require higher Levels of Assurance (LOAs). For example, changing contact information (LOA N+1), making a payment (LOA N+2) or changing the password (LOA N+3) are RP Services that may require higher LOAs associated with the user entity 102 identity before allowing those functions to be completed, where “N” is the base Level Of Assurance for system primary access.
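As a non-limiting illustration of the base-N scheme above, the following sketch assumes a hypothetical mapping of RP Services to LOA offsets (the service names, the BASE_LOA value, and the step-up rule are assumptions, not a prescribed configuration):

```python
# Assumed example mapping of RP Services to LOA offsets above the base level N.
BASE_LOA = 1  # "N": the LOA granted at primary login (assumed value)

SERVICE_LOA_OFFSET = {
    "view_balance": 0,       # LOA N
    "change_contact": 1,     # LOA N+1
    "make_payment": 2,       # LOA N+2
    "change_password": 3,    # LOA N+3
}

def required_loa(service: str) -> int:
    return BASE_LOA + SERVICE_LOA_OFFSET[service]

def needs_step_up(service: str, session_loa: int) -> bool:
    """True when the current session LOA is insufficient for the RP Service."""
    return session_loa < required_loa(service)

if __name__ == "__main__":
    session_loa = BASE_LOA                               # user just logged in
    print(needs_step_up("view_balance", session_loa))    # False
    print(needs_step_up("make_payment", session_loa))    # True -> step-up
```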
The Risk and Analytics Engine 108 and RP Server 110 may, in various examples, be Machine to Machine Digital Key Authentication (M2M-DKA) servers and may utilize a secure communication protocol over the network 111. The Risk and Analytics Engine 108 of the abnormalities detection system and method 100 may generally provide integrated per-user contextual pattern detection for a secure facility operator or any Relying Party and their customers, enabling transparency and detection of attacks and leakage of secure information.
In the illustrated example shown in
In the embodiment of
As shown in
As shown in
Reference item 244 indicates an analytical engine which is configured to receive input from the other sensors in the sensor hub 240 to monitor the user entity spatiotemporal and behavior patterns and habits to determine if the user entity 102 of either of the devices 104 and 106 is the correct entity. For example, habits might include environmental and/or behavioral patterns of the user entity 102 of the devices 104 and 106 such as the time the user entity 102 wakes up, arrives at the gym, and/or arrives at the secure facility and the like. Sensor 246 is used to measure gestures regarding how the user entity 102 handles the devices 104 and 106. For example, these gestures might include how the user entity 102 swipes the screen of the devices 104 and 106 with their finger including pressure, direction, right handed vs. left handed, and the like. In addition, sensor 246 may measure the electromagnetic signature of the operating environment of the devices 104 and 106 to determine if it fits a profile for the user entity 102. For example, the subscriber identification module (SIM) card and mobile identification of the devices 104 and 106 combined with the background electromagnetic factors may all be used in a verification process that the user entity 102 of the devices 104 and 106 is the correct entity. Reference item 248 measures an IP address being used by the devices 104 and 106 and may use a look up feature to verify the devices 104 and 106 are in a region typically occupied by the user entity 102. Camera 250 may be used for facial recognition of the user entity 102 and other biometric inputs such as a tattoo or the like. In addition, the camera 250 may be used to capture a background of the user entity 102 of the devices 104 and 106 to determine if it is an environment in which the user entity 102 oftentimes is found (e.g., a picture hanging behind the user entity 102 of the devices 104 and 106 may conform to a user entity 102 profile). Iris scanner 252 may be used to confirm through an eye scan the identity of the user entity 102. Reference item 254 indicates the devices 104 and 106 “unique identifications” which may be tied to a SIM card number and all associated unique signatures, an International Mobile Equipment Identification (IMEI) number, an Apple® identification, a telecommunications carrier (e.g., AT&T®, TMobile®, Vodafone®, Verizon®), battery serial number or the like. Ambient noise sensor 256 measures the noise levels surrounding the devices 104 and 106 including noises from nature and manmade noises (including communication equipment produced radio frequency noise). Ambient sensor 256 may also be able to measure a speaking voice to create a voiceprint to be able to verify that the user entity 102 is authentic. Reference item 258 is an application that measures the “wellness” of the user entity 102 of the devices 104 and 106 including heart rate, sleep habits, exercise frequency, and the like to gather information on the devices 104 and 106 and the user's lifestyle to contribute to verification decisions. Bus 260 couples the sensors and applications of the hub 240 to the cognitive engine 230.
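The sensor hub 240 described above may be pictured, purely for illustration, as a set of named readings feeding the analytical engine 244. The field names and the equal-weight averaging rule in the sketch below are assumptions; an actual analytical engine could weight or learn these inputs differently:

```python
# Simplified, assumed model of sensor-hub readings feeding an analytical engine.
from dataclasses import dataclass, asdict

@dataclass
class SensorReadings:
    gesture_match: float        # swipe pressure/direction vs. the stored profile
    location_match: float       # IP/region lookup vs. the typical region
    face_match: float           # camera-based facial recognition score
    iris_match: float           # iris scanner score
    ambient_match: float        # ambient noise / voiceprint score
    wellness_match: float       # heart rate, sleep, exercise pattern score

def habit_match_score(readings: SensorReadings) -> float:
    """Average the per-sensor match scores; equal weighting is an assumption."""
    values = list(asdict(readings).values())
    return sum(values) / len(values)

if __name__ == "__main__":
    r = SensorReadings(0.9, 1.0, 0.95, 0.97, 0.8, 0.85)
    print(round(habit_match_score(r), 3))
```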
Both user device 104 and secure system/client device 106 have a cognitive engine 230 used in verifying the identity of the user entity 102.
The Risk and Analytics Engine 108 as shown in
Referring to
The data stored in the system database 108m may contain personal identifiers and sensitive private information that need anonymization. These are tokenized and hashed in transit and at rest via the anonymization token engine 108l as a function of Relying Party privacy rules and guidelines and regional laws, all via the AIML Configurable Policy Engine 108f. Third party data about user entities, devices, and transactions is made available via third party data APIs 108n, enabling cross-company and cross-industry data fusion which can provide blacklists or whitelists, again via the Policy Engine 108f.
The Engine 108 and RP Server 110 include a processor 118 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), a digital signal processor (DSP), an application specific integrated circuit (ASIC), a radio-frequency integrated circuit (RFIC), or any suitable combination thereof), a main memory 504, and a static memory 506, which are configured to communicate with each other via a bus 508. The Engine 108 and RP Server 110 may further include a graphics display 510 (e.g., a plasma display panel (PDP), a light emitting diode (LED) display, a liquid crystal display (LCD), a projector, or a cathode ray tube (CRT)). The engine 108 and RP Server 110 may also include an alphanumeric input device 512 (e.g., a keyboard), a cursor control device 514 (e.g., a mouse, a touchpad, a trackball, a joystick, a motion sensor, or other pointing instrument), a storage unit 516, a signal generation device 518 (e.g., a speaker), and network interface device 116.
The storage unit 516 includes a machine-readable medium 522 on which is stored the instructions 524 (e.g., software) embodying any one or more of the methodologies or functions for operation of the abnormality detection system and method 100 described herein. The instructions 524 may also reside, completely or at least partially, within the main memory 504, within processor 118 (e.g., within the processor's cache memory), or both, during execution thereof by the Engine 108 and RP Server 110. Accordingly, the main memory 504 and processor 118 may be considered as machine-readable media. The instructions 524 may be transmitted or received over a network 526 via network interface device 116.
As used herein, the term “memory” refers to a machine-readable medium able to store data temporarily or permanently and may be taken to include, but not be limited to, random-access memory (RAM), read-only memory (ROM), buffer memory, flash memory, and cache memory. While the machine-readable medium 522 is shown in an example embodiment to be a single medium, the term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) able to store instructions. The term “machine-readable medium” shall also be taken to include any medium, or combination of multiple media, that is capable of storing instructions (e.g., software) for execution by a server (e.g., Engine 108 and RP Server 110), such that the instructions, when executed by one or more processors of the machine (e.g., processor 118), cause the machine to perform any one or more of the methodologies described herein. Accordingly, a “machine-readable medium” refers to a single storage apparatus or device, as well as “cloud-based” storage systems or storage networks that include multiple storage apparatus or devices. The term “machine-readable medium” shall accordingly be taken to include, but not be limited to, one or more data repositories in the form of a solid-state memory, an optical medium, a magnetic medium, or any suitable combination thereof.
Substantial variations may be made in accordance with specific requirements to the embodiments disclosed. For example, customized hardware might also be used, and/or particular elements might be implemented in hardware, software (including portable software, such as applets, etc.), or both.
Certain embodiments are described herein as including logic or a number of components, modules, or mechanisms. Modules may constitute either software modules (e.g., code embodied on a machine-readable medium or in a transmission signal) or hardware modules. A “hardware module” is a tangible unit capable of performing certain operations and may be configured or arranged in a certain physical manner. In various example embodiments, one or more computer systems (e.g., a standalone computer system, a client computer system, or a server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.
In some embodiments, a hardware module may be implemented mechanically, electronically, or any suitable combination thereof. For example, a hardware module may include dedicated circuitry or logic that is permanently configured to perform certain operations. For example, a hardware module may be a special-purpose processor, such as an FPGA or an ASIC. A hardware module may also include programmable logic or circuitry that is temporarily configured by software to perform certain operations. For example, a hardware module may include software encompassed within a general-purpose processor or other programmable processor. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations. Accordingly, the phrase “hardware module” should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein. As used herein, “hardware-implemented module” refers to a hardware module. Considering embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time. For example, where a hardware module comprises a general-purpose processor configured by software to become a special-purpose processor, the general-purpose processor may be configured as respectively different special-purpose processors (e.g., comprising different hardware modules) at different times. Software may accordingly configure a processor, for example, to constitute a particular hardware module at one instance of time and to constitute a different hardware module at a different instance of time. Hardware modules can provide information to, and receive information from, other hardware modules. Accordingly, the described hardware modules may be regarded as being communicatively coupled. Where multiple hardware modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) between or among two or more of the hardware modules. In embodiments in which multiple hardware modules are configured or instantiated at different times, communications between such hardware modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware modules have access. For example, one hardware module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware module may then, at a later time, access the memory device to retrieve and process the stored output. Hardware modules may also initiate communications with input or output devices, and can operate on a resource (e.g., a collection of information).
The various operations of example methods described herein may be performed, at least partially, by one or more processors that are temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions described herein. As used herein, “processor-implemented module” refers to a hardware module implemented using one or more processors.
Similarly, the methods described herein may be at least partially processor-implemented, a processor being an example of hardware. For example, at least some of the operations of a method may be performed by one or more processors or processor-implemented modules. Moreover, the one or more processors may also operate to support performance of the relevant operations in a “cloud computing” environment or as a “software as a service” (SaaS). For example, at least some of the operations may be performed by a group of computers (as examples of machines including processors), with these operations being accessible via a network (e.g., the Internet) and via one or more appropriate interfaces (e.g., an application program interface (API)).
The performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the one or more processors or processor-implemented modules may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other example embodiments, the one or more processors or processor-implemented modules may be distributed across a number of geographic locations.
Some portions of this specification are presented in terms of algorithms or symbolic representations of operations on data stored as bits or binary digital signals within a machine memory (e.g., a computer memory). These algorithms or symbolic representations are examples of techniques used by those of ordinary skill in the data processing arts to convey the substance of their work to others skilled in the art. As used herein, an “algorithm” is a self-consistent sequence of operations or similar processing leading to a desired result. In this context, algorithms and operations involve physical manipulation of physical quantities. Typically, but not necessarily, such quantities may take the form of electrical, magnetic, or optical signals capable of being stored, accessed, transferred, combined, compared, or otherwise manipulated by a machine. It is convenient at times, principally for reasons of common usage, to refer to such signals using words such as “data,” “content,” “bits,” “values,” “elements,” “symbols,” “characters,” “terms,” “numbers,” “numerals,” or the like. These words, however, are merely convenient labels and are to be associated with appropriate physical quantities.
Unless specifically stated otherwise, discussions herein using words such as “processing,” “computing,” “calculating,” “determining,” “presenting,” “displaying,” or the like may refer to actions or processes of a machine (e.g., a computer) that manipulates or transforms data represented as physical (e.g., electronic, magnetic, or optical) quantities within one or more memories (e.g., volatile memory, non-volatile memory, or any suitable combination thereof), registers, or other machine components that receive, store, transmit, or display information. Furthermore, unless specifically stated otherwise, the terms “a” or “an” are herein used, as is common in patent documents, to include one or more than one instance. Finally, as used herein, the conjunction “or” refers to a non-exclusive “or,” unless specifically stated otherwise.
For online transactions, be it financial transactions, data mining, simple logins, physical access control systems, workflows, or identity proofing, the user entities (human or non-human) 102 need to identify themselves on-premises or remotely and reliably with a certain required degree of certainty. The desired level of assurance (LOA) of each associated transaction or session, for example at login or individually within a login session, may vary and hence require a real time response as a function of the LOA required for each transaction or each level of access within the session. For example, a login session for an online banking service (a typical example of a Relying Party) may require not only a credential-based trust model (e.g., simple user name and password) for general login and looking at balance history but may need a dynamic transactional-based model where additional factor(s) are required to transact a transfer or payment. Upon the initiation of a transaction with a higher LOA than the session login, the abnormalities detection system and method 100 may ask the user entity 102 for additional verification to authorize the transaction. Upon completion of the transaction, the session credential will revert to the baseline credential at the lower LOA until the next transaction and associated LOA is presented, or perhaps the session is terminated per the policies orchestrated (e.g., after a time of inactivity). In this description, “time to live” is the pre-determined time that the attributes or verified credentials are valid for. Periodically, an LOA Server will perform a “refresh” to update at least some of the plurality of verified attributes and the verified credentials based on predetermined policies and on demand from the Risk and Analytics Engine 108 and RP Server 110. In the online banking example, to authorize the login and access to the account balance versus a transaction, such as an online payment, the system and method 100 disclosed herein may require different LOAs with different types of multifactor authentication and out of band identity proofing, such as using a combination of contextual information including location, biometrics and the digital fingerprint of a user-binded LOA Provider device, such as a smart phone or wearable with a unique set of attributes and capabilities.
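The session behavior described in this paragraph may be sketched, in a non-limiting way, as a session that starts at a baseline LOA, is stepped up when a higher-LOA transaction arrives, and resumes the baseline once the transaction completes; the transaction labels and LOA values below are illustrative assumptions:

```python
# Minimal sketch, with assumed LOA values, of stepping a session LOA up for a
# transaction and resuming the baseline afterwards.
BASELINE_LOA = 1
TRANSACTION_LOA = {"view_balance": 1, "payment": 2, "transfer": 3}

class Session:
    def __init__(self):
        self.loa = BASELINE_LOA

    def request(self, transaction: str) -> str:
        needed = TRANSACTION_LOA[transaction]
        if needed > self.loa:
            # Ask the user entity for additional verification (e.g., push plus
            # biometric); this sketch simply assumes the step-up succeeds.
            self.loa = needed
            result = f"{transaction}: step-up to LOA {needed}, then authorized"
        else:
            result = f"{transaction}: authorized at LOA {self.loa}"
        self.loa = BASELINE_LOA   # resume the baseline after the transaction
        return result

if __name__ == "__main__":
    s = Session()
    print(s.request("view_balance"))
    print(s.request("transfer"))
```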
A claim that the user entity 102 has a particular identity can be trusted, to varying degrees, to actually be the claimant's “true” identity. Identity claims are made by presenting an identity credential to the Relying Party Server 110. In the case where the user entity 102 is a person, this credential may take several forms, including but not limited to: (a) personally identifiable information such as name, address, birthdate, etc.; (b) an identity proxy such as a username, login identifier (user name), or email address; (c) biometric identifiers such as fingerprint, voice, face, etc.; (d) an X.509 digital certificate; and/or (e) a digital fingerprint and approval from a user-binded device.
Note that Identity Assurance specifically refers to the degree of certainty of an identity assertion made by an identity provider, which may be a machine, service or user, by presenting an identity credential to the Relying Party. In order to issue this assertion, the identity provider must first determine whether or not the claimant requesting the transaction possesses and controls an appropriate token, using a predefined authentication protocol. Depending on the outcome of this authentication procedure, the assertion returned to the Relying Party by the identity provider allows the Relying Party to decide whether or not to trust that the identity associated with the credential actually “belongs” to the person or entity presenting the credential.
Examples of data captured, such as behavior patterns and attributes 606, may include the following. First, device (104 and 106) and browser fingerprints that uniquely identify a device, browser, network, and habit of the user on the device (104, 106) used for accessing compute, data and services. Devices 104 and 106 have footprints that may include browser (602 and/or 604) attributes such as screen size, screen resolution, font, language, and browser version. Second, central processing unit (CPU) and operating system changes may not be acceptable, but a browser upgrade may be acceptable. Third, user entity behavior and habits and inference of the user entity's normal behavior, to identify risks associated with transactions. Fourth, trusted devices, which are devices that have been repeatedly authenticated over a period of time; the number of top trusted devices may be limited to a number (e.g., 5). Fifth, a risk based authentication system that uses a mobile device or other modalities of verification, such as email, short message service (SMS), voice, push, and voice call, to promote locations, machines, and times and types of transactions to trusted events/habits of user entities 102. The system and method 100 allows for calculating individual transaction risk based on contextual factors such as user behavior, device, browser and the network traffic, and requests authentication by the account owner when the risk is greater than the allowed threshold. Sixth, a PC desktop that has not been used for a long period of time (e.g., days or weeks) will be dropped from a trusted device list. Seventh, location, which may be found by Internet Protocol (IP) reverse lookup of the Internet Service Provider (ISP). Eighth, the user entity 102 behavioral footprint on a desktop PC (client device 106), such as the speed of user entity 102 typing and the number of hours and time intervals the user is on this device (e.g., an iMac® at home is usually used in evenings and on weekends; use of a touch screen feature). Ninth, the user entity 102 behavior footprint might also include: time of use; location of use; hardware (including auxiliary devices such as type of keyboard and mouse, and user behavior on both); browser specific data such as browser updates and changes (i.e., heuristics), browser type, browser version, plug-ins and applications; brand and type of CPU and operating system; browser user configuration such as fonts (e.g., expected fonts versus user configured fonts), language and the like; canvas fingerprinting, type of display, screen resolution; and/or time zone, internet protocol (IP) address, geographic location. Tenth, code in the browser (e.g., JavaScript code) and/or installed on the device (104, 106) that executes on the computer and collects data from the desktop 106 may be used. Eleventh, the mobile device 104 footprint may include the subscriber identity module (SIM), international mobile equipment identity (IMEI), applications on the device, and/or secret keys. Twelfth, with regard to the mobile device 104, a derived behavior footprint such as location, habits, walking gait, exercise, and how many times the user calls their top contacts (e.g., top 5 contacts). Thirteenth, the sequence of events and derived context of normal versus abnormal may also be considered.
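The attribute categories enumerated above may be grouped, purely for illustration, into a simple fingerprint record. The sketch below (field names and the hashing choice are assumptions) shows one way the relatively static browser and device attributes could be hashed into a stable identifier while behavioral attributes are kept separate for scoring:

```python
# Illustrative grouping of the attributes listed above into a fingerprint
# record; the field names and hashing choice are assumptions.
import hashlib
import json
from dataclasses import dataclass, field

@dataclass
class DeviceBrowserFingerprint:
    screen: str
    fonts: list
    language: str
    browser_version: str
    os: str
    timezone: str
    ip_geo: str

    def stable_id(self) -> str:
        """Hash the relatively static attributes into one identifier."""
        blob = json.dumps(self.__dict__, sort_keys=True).encode()
        return hashlib.sha256(blob).hexdigest()

@dataclass
class BehaviorFootprint:
    typing_speed_wpm: float
    usual_hours: list = field(default_factory=list)   # e.g., [18, 23] evenings
    touchscreen_used: bool = False

if __name__ == "__main__":
    fp = DeviceBrowserFingerprint("2560x1440", ["Helvetica"], "en-US",
                                  "Chrome/120", "macOS", "America/Los_Angeles",
                                  "Portland, OR")
    print(fp.stable_id()[:16])
```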
In step 800-1, the user entity 102 attempts to connect to the RP Server 110 but is also connected to the Risk and Analytics Engine 108 using the network 111. The Risk and Analytics Engine 108 uses inferences on abnormalities to measure transaction risk, both for forensics and for injecting the required level of authentication. Upon a first transaction and engagement by a user device 104 in step 800-1, the system and method 100 determines if the device 104 has been seen before (step 800-2) or is a suspect device. In step 800-3, a test and a User Browser Traffic Device (UBTD) search & match is performed in the UBTD Search and Match Engine 108j to determine if the device (104 and/or 106) has been seen before and is a suspect machine. If yes, in step 800-4, the system proceeds to UBTD Advanced Tracking, which will be discussed in detail below. If no, in step 800-5, a browser and device inference is made in which behavior information and browser and device (104, 106) attributes are collected by the Engine 108. A profiling process is conducted to determine whether a first time visitor is a clean, normal device and browser or not. In the profiling process, a significant number of device 104 and browser 604 data points that can uniquely identify the user entity 102 and/or user device 104 are captured. A set of unique user behavior characteristics is also captured, derived from mobile authenticators when provisioned and allowed (a user option). As the visitor traverses the site, the system and method 100 capture data and identify whether the visitor needs to be on the watch list.
In step 800-6 as shown in
In step 800-7, the access is profiled and a risk score is calculated. Based on the behavioral data, the device 104 and browser 604 fingerprint, and the use (or not) of mobile device authentication of transactions, the risk engine generates a risk score. The risk score can be generated locally on a mobile device 104 or, alternatively, by sending all the information to the Engine 108. Next, a behavioral confidence score is generated indicating that the user entity 102 of the mobile device 104 (or secure system/client device 106) is in fact the legitimate user, and not someone else. This behavioral confidence score may be paired with a behavioral fingerprint generated for the user entity 102 of the secure system/client device 106. The user entity 102 establishes the fingerprint profile of the device 104.
In step 800-8, a test is made to see if the profile and risk score are greater than the predetermined risk threshold. If yes, in step 800-9, the device (104 and/or 106) is suspected or known to be a bad device and the flow proceeds to UBTD Advanced Tracking (step 800-4) so that the device is added to the list of suspect devices. The UBTD is the central database tracking all whitelisted, blacklisted, at-risk, and suspect devices, browsers, transactions, mobile devices, and habits of the users (good actors versus threat actors). It allows subsequent accesses to be searched and matched against historical events associated with a user, device, browser, and the like, and the fetched data all feeds the Core AIML Risk Engine 108a, which applies artificial intelligence and machine learning for per-transaction risk-versus-friction evaluation.
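A toy, non-limiting sketch of the UBTD role described above follows; the table layout, statuses, and fingerprint keys are assumptions. Consecutive accesses are matched against historical records keyed by device fingerprint, and suspect matches are routed to advanced tracking:

```python
# Toy sketch of a UBTD-style lookup; the statuses and schema are assumptions.
UBTD = {
    "fp-aaa111": {"status": "whitelisted", "last_seen": "2021-01-01"},
    "fp-bbb222": {"status": "blacklisted", "last_seen": "2020-12-30"},
}

def search_and_match(fingerprint: str) -> str:
    record = UBTD.get(fingerprint)
    if record is None:
        return "first-time device: run browser/device inference and profiling"
    if record["status"] == "blacklisted":
        return "suspect device: route to advanced tracking"
    return "known device: continue risk scoring"

if __name__ == "__main__":
    for fp in ("fp-aaa111", "fp-bbb222", "fp-ccc333"):
        print(fp, "->", search_and_match(fp))
```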
If no, in step 800-10, the user entity 102 logs in with the measured risk score. In step 800-11, the user entity 102 is authenticated and the flow authorized. If the user entity 102 has a mobile application (e.g., the mobile eGuardian® app) with various plugins such as biobehavioral, multi-factor authentication, health, and/or Software Development Kits, then the system and method 100 uses the dynamic level of assurance (LOA) and risk score that come from the mobile device 104 as part of the multimodal evaluation, with the corresponding confidence score feeding into the risk threshold calculation on the backend of the Risk and Analytics Engine 108. In cases where the user entity 102 uses a well-known browser (e.g., Chrome®) and there is a legitimate match with an Internet Protocol (IP) address that is tied to a trusted device, location, and/or habit (e.g., the user entity 102 home address or work address, including a virtual private network (VPN) from a legitimate corporation), then all is marked as good from the authentication perspective. However, in cases where the user entity 102 uses tools to stay anonymous, such as TOR, this user entity 102 is captured and may be designated an anomalous user that evidently has the credentials and the out of band (OOB) mobile device 104 and can comply with the dynamic LOA engine 108d. The system and method 100 can apply a higher LOA and, for example, mandate a push plus biometric authentication when TOR or other anonymity tools are used, instead of a time-based one time password (TOTP) via SMS, voice, email or TOTP apps/services.
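The escalation described here may be sketched as a small policy function; the modality names and the TOR/trusted-IP signals below are assumptions used only for illustration. When anonymization tooling is detected, the policy mandates push plus biometric rather than a TOTP delivered over SMS, voice, or email:

```python
# Assumed policy sketch: pick an out-of-band modality based on network context.
def choose_modality(uses_tor: bool, ip_matches_trusted: bool) -> str:
    if uses_tor:
        # Anonymization tooling detected: mandate the strongest modality.
        return "push + biometric"
    if ip_matches_trusted:
        return "no additional factor"        # trusted device/location/habit
    return "TOTP via SMS, voice, or email"   # ordinary step-up

if __name__ == "__main__":
    print(choose_modality(uses_tor=True, ip_matches_trusted=False))
    print(choose_modality(uses_tor=False, ip_matches_trusted=True))
    print(choose_modality(uses_tor=False, ip_matches_trusted=False))
```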
If no mobile platform is available to multifactor authenticate (MFA) via a trusted device, or if the user entity 102 fails the authentication before the authorization, the system and method 100 can then enforce other OOB modalities (as outlined in
Note that for every event trigger, the Engine database 108m is updated, scores are tracked, and machine learning may be used to draw a graph of normal versus abnormal behavior for the user entity 102 habits, transactions, approvals, and/or declines. This overall backend history is used by an artificial intelligence/machine learning process to drive towards perfecting the predictive analytics at the individual or group level of user entities.
Note that a mobile application (e.g., the eGuardian® mobile app) may have a significant amount of contextual data from the device 104 location, IP address, fourth generation of cellular broadband network technology (4G) data, history, current location and habits via the health application data, walking gait, and a collection of heuristics which are acquired continuously as the user entity 102 uses the mobile device 104 throughout the day. In one embodiment, when legitimate users have employed the mobile application, abnormalities are detected and the LOA is escalated dynamically, so that the ultimate control is in the hands of the legitimate user 102, leveraging the mobile device data and the behaviorally derived confidence score. For user entities 102 that do not have the mobile application, abnormalities can still be detected and can force an out of band (OOB) challenge (e.g., email or text/voice) demanding a time-based TOTP token. The imposed friction then only taxes threat actors that try to penetrate the Relying Party Server 110 acting as legitimate users with a pool of valid usernames and passwords.
In step 800-11, the system 100 proceeds to a decision point whether to add a supercookie (S-Cookie), which is a hard unique ID-token stored on the device, browser, or both as the unique identifier of a trusted, issued, registered and authenticated device or an approved bring-your-own-device (BYOD). Injecting a supercookie can be used to better classify and serve the legitimate user entities 102 and protect the enterprise from threat actors. If yes, the system 100 proceeds to step 800-9 for login and step 800-10 to authenticate and authorize the flow. The user entity 102 may be prompted for a higher level of assurance (LOA) and out of band authentication (as illustrated in
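The supercookie decision may be illustrated with a short, non-limiting sketch; the token format, the HMAC construction, and the storage choices are assumptions rather than the system's actual mechanism. A hard, unique token is minted for a device that passes authentication and is presented on subsequent visits as one more identifier:

```python
# Assumed sketch of minting and verifying a device-bound "supercookie" token.
import hashlib
import hmac
import secrets

SERVER_SECRET = secrets.token_bytes(32)   # kept server-side in practice

def mint_supercookie(device_id: str) -> str:
    nonce = secrets.token_hex(8)
    sig = hmac.new(SERVER_SECRET, f"{device_id}:{nonce}".encode(),
                   hashlib.sha256).hexdigest()
    return f"{device_id}:{nonce}:{sig}"

def verify_supercookie(token: str) -> bool:
    device_id, nonce, sig = token.split(":")
    expected = hmac.new(SERVER_SECRET, f"{device_id}:{nonce}".encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)

if __name__ == "__main__":
    token = mint_supercookie("device-104")
    print(verify_supercookie(token))          # True
    print(verify_supercookie(token + "x"))    # False (tampered)
```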
Returning to step 900-1, if the user entity 102 does not enter the correct password, the user entity 102 is given several chances to either request a new password or enter a correct password in steps 900-6 and 900-7. In step 900-8, the mobile authorization for the new password is requested. In step 900-9, the out of band authorization is approved by the user entity 102. In step 900-10, the password is reset by short message service (SMS), email, knowledge-based authentication (KBA), or the like. The sequence of events, device information, and authentication and authorization of all transactions are all stored in the advanced tracking database in the Engine 108, which is a high speed database used for fast search and match of future transactions from whitelisted or blacklisted devices, IPs, transactions, locations, etc.
Below are samples of ground truth test tables to exemplify different scenarios which can uniquely identify the user entity 102 and their device (or devices) 104 and 106:
The foregoing has outlined rather broadly the features and technical advantages of examples in order that the detailed description that follows can be better understood. The foregoing embodiments are presented by way of example only; the scope of the present disclosure is to be limited only by the claims. Various embodiments may omit, substitute, or add various procedures or components as appropriate. For instance, in alternative configurations, the methods described may be performed in an order different from that described, and/or various stages may be added, omitted, and/or combined. Also, features described with respect to certain embodiments may be combined in various other embodiments. Different aspects and elements of the embodiments may be combined in a similar manner. Also, technology evolves and, thus, many of the elements are examples that do not limit the scope of the disclosure to those specific examples. Additional features and advantages will be described hereinafter. The conception and specific examples disclosed can be readily utilized as a basis for modifying or designing other structures for carrying out the same purposes of the present disclosure. Such equivalent constructions do not depart from the spirit and scope of the appended claims. Each of the figures is provided for the purpose of illustration and description only and not as a definition of the limits of the claims. No language in the specification should be construed as indicating any non-claimed element as essential to the practice of the invention.
Throughout this specification, plural instances may implement components, operations, or structures described as a single instance. Although individual operations of one or more methods are illustrated and described as separate operations, one or more of the individual operations may be performed concurrently, and nothing requires that the operations be performed in the order illustrated. Structures and functionality presented as separate components in example configurations may be implemented as a combined structure or component. Similarly, structures and functionality presented as a single component may be implemented as separate components. These and other variations, modifications, additions, and improvements fall within the scope of the subject matter herein.
Specific details are given in the description to provide a thorough understanding of the embodiments. However, embodiments may be practiced without these specific details. For example, well-known processes, structures, and techniques have been shown without unnecessary detail in order to avoid obscuring the embodiments. This description provides example embodiments only, and is not intended to limit the scope, applicability, or configuration of the disclosure. Rather, the preceding description of the embodiments will provide those skilled in the art with an enabling description for implementing embodiments of the disclosure. Various changes may be made in the function and arrangement of elements without departing from the spirit and scope of the disclosure.
Although process (or method) steps may be described or claimed in a particular sequential order, such processes may be configured to work in different orders. In other words, any sequence or order of steps that may be explicitly described or claimed does not necessarily indicate a requirement that the steps be performed in that order unless specifically indicated. Moreover, the illustration of a process by its depiction in a drawing does not imply that the illustrated process is exclusive of other variations and modifications thereto, does not necessarily imply that the illustrated process or any of its steps are necessary to the embodiment(s), and does not imply that the illustrated process is preferred.
To aid the Patent Office and any readers of any patent issued on this application in interpreting the claims appended hereto, applicants wish to note that they do not intend any of the appended claims or claim elements to invoke 35 U.S.C. 112(f) unless the words “means for” or “step for” are explicitly used in the particular claim.
The definitions of the words or elements of the claims shall include not only the combination of elements which are literally set forth, but all equivalent structure, material or acts for performing substantially the same function in substantially the same way to obtain substantially the same result.
Neither the Title (set forth at the beginning of the first page of the present application) nor the Abstract (set forth at the end of the present application) is to be taken as limiting in any way as the scope of the disclosed invention(s). The title of the present application and headings of sections provided in the present application are for convenience only, and are not to be taken as limiting the disclosure in any way.
Devices that are described as in “communication” with each other or “coupled” to each other need not be in continuous communication with each other or in direct physical contact, unless expressly specified otherwise. On the contrary, such devices need only transmit to each other as necessary or desirable, and may actually refrain from exchanging data most of the time. For example, a machine in communication with or coupled with another machine via the Internet may not transmit data to the other machine for long periods of time (e.g., weeks at a time). In addition, devices that are in communication with or coupled with each other may communicate directly or indirectly through one or more intermediaries.
It should be noted that the recitation of ranges of values in this disclosure are merely intended to serve as a shorthand method of referring individually to each separate value falling within the range, unless otherwise indicated herein, and each separate value is incorporated into the specification as if it were individually recited herein. Therefore, any given numerical range shall include whole and fractions of numbers within the range. For example, the range “1 to 10” shall be interpreted to specifically include whole numbers between 1 and 10 (e.g., 1, 2, 3, . . . 9) and non-whole numbers (e.g., 1.1, 1.2, . . . 1.9).
This application is a Continuation of U.S. patent application Ser. No. 17/140,017, filed on Jan. 1, 2021; which is a Continuation-In-Part of U.S. patent application Ser. No. 16/298,990, filed on Mar. 11, 2019; which claims priority to U.S. Provisional Patent Application No. 62/641,362, filed Mar. 11, 2018; all of these applications are hereby incorporated by reference in their entirety.