System to identify vulnerable card readers

Information

  • Patent Grant
  • 11468450
  • Patent Number
    11,468,450
  • Date Filed
    Wednesday, August 26, 2020
  • Date Issued
    Tuesday, October 11, 2022
Abstract
Example embodiments relate to a network-based vulnerability detection system configured to access a database of customer transaction data corresponding to a set of card readers that includes transaction codes, receive an identification of a set of compromised card readers among the set of card readers, identify common transaction codes within the transaction data of the set of compromised card readers, and correlate the common transaction codes to one or more instances of fraud associated with the compromised set of card readers. In some example embodiments, the vulnerability detection system may be applied to monitor one or more card readers, receive transaction data corresponding to transactions conducted through the card readers, identify the common transaction codes correlated to the instances of fraud, and cause display of a notification that includes an indication of the instance of fraud at a client device.
Description
TECHNICAL FIELD

The present disclosure generally relates to the technical field of special-purpose machines to detect fraudulent activity, and more particularly to payment card fraud detection in card readers.


BACKGROUND

Credit card skimmers are devices placed on top of actual credit card readers that copy information from the magnetic strips of credit cards. While some credit card readers may be developed to make the installation of credit card skimmers difficult, it is currently impossible to identify the existence of a credit card skimmer based on transaction data collected from the credit card reader alone.





BRIEF DESCRIPTION OF THE DRAWINGS

Various ones of the appended drawings merely illustrate example embodiments of the present disclosure and are not intended to limit its scope to the illustrated embodiments. On the contrary, these examples are intended to cover alternatives, modifications, and equivalents as may be included within the scope of the disclosure.



FIG. 1 is a network diagram depicting a network system comprising a group of application servers in communication with a network-based vulnerability detection system configured to detect instances of fraud, consistent with some embodiments.



FIG. 2 is a block diagram illustrating various components of the vulnerability detection system, which is provided as part of the network system, consistent with some embodiments.



FIG. 3 is a flowchart illustrating a method for correlating existing transaction codes with an instance of fraud, consistent with example embodiments.



FIG. 4 is a flowchart illustrating a method for identifying an instance of fraud based on existing transaction codes, consistent with some example embodiments.



FIG. 5 is a flowchart illustrating a method for calculating a vulnerability score of a card reader based on transaction data, consistent with some embodiments.



FIG. 6 is a diagram illustrating a potentially vulnerable card reader, consistent with some embodiments.



FIG. 7 is a diagrammatic representation of a machine in the example form of a computer system within which a set of instructions for causing the machine to perform any one or more of the methodologies discussed herein may be executed.





DETAILED DESCRIPTION

Reference will now be made in detail to specific example embodiments for carrying out the inventive subject matter of the present disclosure. In the following description, specific details are set forth in order to provide a thorough understanding of the subject matter. It shall be appreciated that embodiments may be practiced without some or all of these specific details.


Example embodiments relate to a network-based vulnerability detection system configured to access a database of customer transaction data corresponding to a set of card readers. The customer transaction data may, for example, include transaction codes indicating a status and type of each transaction conducted through the set of card readers. The network-based vulnerability detection system may receive an identification of a set of compromised card readers among the set of card readers, identify common transaction codes within the transaction data of the set of compromised card readers, and correlate the common transaction codes to one or more instances of fraud associated with the compromised set of card readers. In some example embodiments, the vulnerability detection system may be applied to monitor one or more card readers, receive transaction data corresponding to a transaction conducted through the card readers, identify the common transaction codes correlated to the instances of fraud, and cause display of a notification that includes an indication of the instance of fraud at a client device.


A “card reader” is an input device that reads data from card-shaped storage mediums. For example, a card reader may read magnetic strip cards (e.g., credit cards), barcodes, proximity cards (e.g., 26-bit Wiegand format), Chip Authentication Program (CAP) cards, as well as smart cards that include an embedded microprocessor and memory. Card readers may be used in retail locations to allow an individual to provide payment through the card reader to facilitate a transaction. Card readers are often configured to log transaction data associated with transactions conducted through a card reader, and in some instances may identify and log errors in the transactions as well. For example, a card reader may access a database of “transaction codes” that indicate a status or event associated with a transaction. The database may include a set of pre-configured transaction codes to indicate, among other things: a successful transaction; an incomplete or faulty card swipe; or a declined card. The vulnerability detection system may therefore be configured to access such logs and databases, and correlate particular transaction codes with specific instances of fraud, not previously detectable through a review of transaction codes due to a lack of relevant, specific transaction codes.
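As a concrete illustration of such a database of pre-configured transaction codes, the sketch below uses a small in-memory table; the specific code values and descriptions are hypothetical, not taken from the patent:

```python
# Hypothetical table of pre-configured transaction codes; the code
# values and descriptions are illustrative, not from the patent.
TRANSACTION_CODES = {
    "00": "successful transaction",
    "51": "declined card",
    "96": "incomplete or faulty card swipe",
}

def describe(code: str) -> str:
    """Return the logged status for a transaction code, or a fallback
    for codes not present in the pre-configured table."""
    return TRANSACTION_CODES.get(code, "unknown code")
```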


The vulnerability detection system may be further configured to monitor transaction data from one or more card readers (e.g., within a retail location), identify a transaction code correlated with an instance of fraud within the transaction data, and in response to identifying the correlated transaction code, signal a client device to cause display of a notification that includes an indication of the instance of wrongdoing. The transaction data monitored may include card reader identifiers, time stamps, transaction codes, as well as card identifiers of cards associated with the corresponding transactions. Upon identifying a transaction code correlated with a particular instance of fraud (e.g., a card skimmer), the vulnerability detection system may cause display of a notification at a client device that includes a presentation of the relevant card reader identifiers associated with the transaction code, the transaction code, time stamp data, and an indication of the instance of fraud, as well as other relevant transaction information.


In some example embodiments, the vulnerability detection system monitors the one or more card readers over a period of time, and determines a frequency or rate in which transaction codes correlated with instances of fraud are identified. The period of time may be defined based on a specific time period (e.g., Monday, or Monday from 9:00 am to 12:00 pm), or may be based on a number of transactions conducted through a particular card reader (e.g., fifty transactions), or on a number of card swipe attempts (e.g., fifty card swipes). The vulnerability detection system collects the transaction data over the specified time period and determines a rate, a number, and/or a frequency in which a transaction code correlated to an instance of fraud appears. Based on the rate, number, and/or frequency, the vulnerability detection system calculates a “vulnerability score” of each card reader.
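The rate determination described above can be sketched as follows; the flat log-of-dicts schema and the `code` field name are assumptions made here for illustration:

```python
def fraud_code_rate(transactions, fraud_codes, window):
    """Fraction of the most recent `window` transactions whose
    transaction code has been correlated with an instance of fraud.
    `transactions` is an ordered log of dicts with a "code" field
    (a hypothetical schema used only for this sketch)."""
    recent = transactions[-window:]
    if not recent:
        return 0.0
    hits = sum(1 for txn in recent if txn["code"] in fraud_codes)
    return hits / len(recent)
```

The window argument mirrors the time-period definitions above: a caller can pass the last fifty transactions, or pre-slice the log to a temporal period before calling.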


In some example embodiments, the vulnerability detection system may also factor a card reader type into the calculation of the vulnerability score. Card readers may include self-checkout card readers (e.g., card readers that are not managed by an in-person attendant), as well as managed card readers (card readers that are managed by an in-person attendant). The vulnerability detection system may access transaction data received from both managed card readers and self-checkout card readers and determine an expected rate, number, and frequency of various transaction codes that appear over a period of time. Having determined an expected rate for both the self-checkout card readers and the managed card readers, the vulnerability detection system may weight a vulnerability score given to a particular card reader based on a normalization function calculated based on the expected rate, number, and frequency.
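One minimal way to weight a reader's observed rate against the expected rate for its type (self-checkout vs. managed) is a simple ratio; the patent does not fix a particular normalization function, so this formula is an illustrative assumption:

```python
def weighted_rate(observed_rate, expected_rate):
    """Normalize an observed fraud-code rate by the expected rate for
    the card reader's type. A result above 1.0 means the reader sees
    the code more often than peers of the same type."""
    if expected_rate <= 0:
        # No baseline: any observation is anomalous, none is neutral.
        return float("inf") if observed_rate > 0 else 0.0
    return observed_rate / expected_rate
```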


In some example embodiments, the vulnerability detection system may cause display of the notification upon detecting a transgression of a vulnerability threshold by a vulnerability score. The vulnerability threshold may be defined by a user (e.g., flag all card readers with a vulnerability score above a certain value), or in some embodiments may be determined based on a normalization function and/or a historical time series of vulnerability scores across physical locations (e.g., retail locations). For example, transaction data over a period of time may be reviewed and analyzed in order to determine an occurrence rate of each transaction code among the set of transaction codes. Upon confirming an expected occurrence rate of a transaction code, the vulnerability detection system may calculate a vulnerability threshold.
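A vulnerability threshold derived from a historical time series of occurrence rates might, for example, flag readers whose rate exceeds the historical mean by a few standard deviations; the mean-plus-k-sigma rule below is an illustrative choice, not the patent's specified function:

```python
from statistics import mean, stdev

def vulnerability_threshold(historical_rates, k=3.0):
    """Compute a threshold from past occurrence rates of a transaction
    code: the historical mean plus k standard deviations. Requires at
    least two historical observations."""
    return mean(historical_rates) + k * stdev(historical_rates)
```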


In some example embodiments, the vulnerability detection system is further configured to disable a card reader in response to identifying a transaction code correlated to an instance of fraud, and/or detecting that a vulnerability score has transgressed the vulnerability threshold. For example, the instance of fraud may include the existence of a card skimmer on the card reader. Upon identifying the correlated transaction code, the vulnerability detection system may cause the card reader to decline further transaction requests, shut off, or otherwise indicate that the card reader is compromised.



FIG. 1 is a network diagram illustrating a network environment 100 suitable for operating a vulnerability detection system 150. A networked system 102 provides server-side functionality, via a network 104 (e.g., an intranet, the Internet or a Wide Area Network (WAN)), to one or more clients such as the client device(s) 110 and server 130. FIG. 1 illustrates a web client 112 and client application(s) 114 executing on respective client device(s) 110.


An Application Program Interface (API) server 120 and a web server 122 are coupled to, and provide programmatic and web interfaces respectively to, one or more application server(s) 140. The application servers 140 host the vulnerability detection system 150. The application server(s) 140 are, in turn, shown to be coupled to one or more database servers 124 that facilitate access to one or more databases 126.


The vulnerability detection system 150 identifies correlations between preconfigured transaction codes and instances of fraud. For example, the vulnerability detection system 150 is configured to access the databases 126 to retrieve transaction data collected from a set of card readers, identify a set of compromised card readers among the set of card readers, and correlate common transaction codes between the compromised card readers with an instance of fraud.


As shown, the network environment 100 includes the client device(s) 110 in communication with the networked system 102 over the network 104. The networked system 102 communicates and exchanges data with the client device(s) 110 that pertains to various functions and aspects associated with the networked system 102 and its users. Likewise, the client device(s) 110, which may be any of a variety of types of devices that include at least a display, a processor, and communication capabilities that provide access to the network 104 (e.g., a smart phone, a tablet computer, a personal digital assistant (PDA), a personal navigation device (PND), a handheld computer, a desktop computer, a laptop or netbook, or a wearable computing device), may be operated by a user (e.g., a person) of the networked system 102 to exchange data with the networked system 102 over the network 104.


The client device(s) 110 communicates with the network 104 via a wired or wireless connection. For example, one or more portions of the network 104 may comprise an ad hoc network, an intranet, an extranet, a Virtual Private Network (VPN), a Local Area Network (LAN), a wireless LAN (WLAN), a Wide Area Network (WAN), a wireless WAN (WWAN), a Metropolitan Area Network (MAN), a portion of the Internet, a portion of the Public Switched Telephone Network (PSTN), a cellular telephone network, a wireless network, a Wireless Fidelity (Wi-Fi®) network, a Worldwide Interoperability for Microwave Access (WiMax) network, another type of network, or any suitable combination thereof.


In various embodiments, the data exchanged between the client device(s) 110 and the networked system 102 may involve user-selected functions available through one or more user interfaces (UIs). The UIs may be specifically associated with the web client 112 (e.g., a browser) or the client application 114, executing on the client device(s) 110, and in communication with the networked system 102.



FIG. 2 is a block diagram illustrating various components of the vulnerability detection system 150, which is provided as part of the networked system 102, consistent with some embodiments. To avoid obscuring the inventive subject matter with unnecessary detail, various functional components (e.g., modules and engines) that are not germane to conveying an understanding of the inventive subject matter have been omitted from FIG. 2. However, a skilled artisan will readily recognize that various additional functional components may be supported by the vulnerability detection system 150 to facilitate additional functionality that is not specifically described herein.


As is understood by skilled artisans in the relevant computer arts, each functional component (e.g., module) illustrated in FIG. 2 may be implemented using hardware (e.g., a processor of a machine) or a combination of logic (e.g., executable software instructions) and hardware (e.g., memory and processor of a machine) for executing the logic. Furthermore, the various functional components depicted in FIG. 2 may reside on a single computer (e.g., a laptop), or may be distributed across several computers in various arrangements such as cloud-based architectures. Moreover, any two or more modules of the vulnerability detection system 150 may be combined into a single module, or subdivided among multiple modules. It shall be appreciated that while the functional components (e.g., modules) of FIG. 2 are discussed in the singular sense, in other embodiments, multiple instances of one or more of the modules may be employed.


The communication module 210 provides functionality to communicate with client devices (e.g., client devices 110), data source 130 (e.g., a card reader), and databases 126 in order to access transaction data, cause display of notifications, or signal card readers. The transaction data may for example include card reader identifiers, transaction amounts, time stamps, transaction codes, as well as card identifiers of cards associated with the corresponding transactions.


The identification module 220 provides functionality to identify common transaction codes between compromised card readers among a set of card readers. In response to the communications module 210 accessing transaction data at the database 126, the identification module 220 may identify common transaction codes among the transaction data of a set of compromised card readers. The identification module 220 may receive an identification of a set of compromised card readers among the set of card readers (e.g., based on card reader identifiers). The identification module 220 may flag the common transaction codes for the correlation module 230. The correlation module 230 provides functionality to define correlations between the common transaction codes identified by the identification module 220 and instances of fraud (e.g., a card skimmer). The correlation module 230 may store the correlations within the database 126.
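The identification step described above reduces to a set intersection over the codes logged by each compromised reader. A minimal sketch, assuming a flat log of dicts with hypothetical `reader_id` and `code` fields:

```python
def common_transaction_codes(transactions, compromised_ids):
    """Return the transaction codes that appear in the logs of every
    compromised card reader (intersection across readers)."""
    # Group the logged codes by card reader identifier.
    codes_by_reader = {}
    for txn in transactions:
        codes_by_reader.setdefault(txn["reader_id"], set()).add(txn["code"])
    # Intersect the code sets of the compromised readers only.
    common = None
    for reader_id in compromised_ids:
        codes = codes_by_reader.get(reader_id, set())
        common = codes if common is None else common & codes
    return common or set()
```

Codes surviving the intersection are the candidates the correlation module would associate with the instance of fraud.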


The presentation module 240 provides functionality to generate and cause display of a notification at a client device 110. For example, in response to identifying a common transaction code correlated to an instance of fraud, the presentation module 240 may generate and cause display of a notification at a client device 110. The notification includes an indication of a card reader identifier, and corresponding transaction data that include the common transaction code.



FIG. 3 is a flowchart illustrating a method 300 for correlating existing transaction codes with an instance of fraud, according to some example embodiments. The method 300 is embodied in computer-readable instructions for execution by one or more processors such that the operations of the method 300 are performed in part or in whole by the network-based vulnerability detection system 150; accordingly, the method 300 is described below by way of example with reference thereto. However, it shall be appreciated that at least some of the operations of the method 300 may be deployed on various other hardware configurations, and the method 300 is not intended to be limited to the network-based vulnerability detection system 150.


At operation 310, the communication module 210 communicates a request to the database server 124 to access the database 126 to retrieve transaction data. The database 126 may collect and store transaction data from a set of card readers. The transaction data may for example include card reader identifiers, time stamps, transaction codes, as well as card identifiers of cards associated with the corresponding transactions.


At operation 320, the identification module 220 receives an identification of a compromised set of card readers among the set of card readers corresponding to the transaction data. The identification may be based on a set of card reader identifiers, or transaction data that corresponds to a compromised set of card readers. For example, the identification module 220 may receive a time period in which all card readers that conducted transactions were compromised, or a card identifier that was known to have been compromised, such that all card readers that include transaction data indicating the card identifier may be compromised. Based on the transaction data, the identification module 220 flags the compromised set of card readers.
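Flagging by a known-compromised card identifier can be sketched as a scan of the transaction log; the `reader_id` and `card_id` field names are hypothetical:

```python
def flag_compromised_readers(transactions, compromised_card_id):
    """Flag every card reader whose transaction data includes a card
    identifier known to have been compromised."""
    return {txn["reader_id"] for txn in transactions
            if txn["card_id"] == compromised_card_id}
```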


At operation 330, the identification module 220 analyzes the transaction data associated with the compromised set of card readers to identify common transaction codes. The transaction codes may include preconfigured transaction codes of the card readers that are intended to indicate a status or event associated with a transaction. For example, the transaction codes may indicate: completion of a successful transaction; an incomplete or faulty card swipe; or a declined card.


At operation 340, having identified the common transaction codes within the transaction data of the set of compromised card readers, the identification module 220 transmits the common transaction codes to the correlation module 230, which then correlates the common transaction code with an instance of fraud. The instance of fraud may for example include a card skimmer installed on the set of compromised card readers. The correlation module 230 may store the correlation within the database 126.



FIG. 4 is a flowchart illustrating additional operations of the method 300, for identifying an instance of fraud based on existing transaction codes, according to some example embodiments. As discussed with reference to FIG. 3, the vulnerability detection system 150 is configured to correlate instances of fraud with preconfigured transaction codes of a card reader. The preconfigured transaction codes may not necessarily indicate a particular, specific instance of fraud.


The method 300 is embodied in computer-readable instructions for execution by one or more processors such that the operations of the method 300 are performed in part or in whole by the network-based vulnerability detection system 150; accordingly, the method 300 is described below by way of example with reference thereto. However, it shall be appreciated that at least some of the operations of the method 300 may be deployed on various other hardware configurations, and the method 300 is not intended to be limited to the network-based vulnerability detection system 150. As shown in FIG. 4, one or more operations 410, 420, 430 and 440 may be performed as part (e.g., a precursor task, a subroutine, or a portion) of the method 300, in which the vulnerability detection system 150 correlates existing transaction codes to instances of fraud, according to some example embodiments.


At operation 410, the communication module 210 monitors a set of card readers. Each card reader has an associated card reader identifier. The set of card readers may be located within a specific physical location, or at multiple physical locations. For example, the card readers may include a set of Automated Teller Machines (ATMs) distributed across a particular geographic area, or may be card readers located within a single retail location of a retailer.


At operation 420, the communication module 210 receives transaction data from a card reader from among the set of card readers being monitored. The transaction data corresponds to a transaction conducted through the card reader. For example, the transaction may include a request to retrieve funds from an ATM, or a request to purchase items at the retail location.


At operation 430, the identification module 220 identifies the transaction code correlated with the instance of fraud within the transaction data of the card reader being monitored. In response to identifying the transaction code correlated with the instance of fraud, the presentation module 240 generates and causes display of a notification at a client device 110. The notification includes transaction data of the card reader, such as the transaction code correlated with the instance of fraud. To cause display of the notification at the client device 110, the presentation module 240 may generate a set of instructions that, when executed by the client device 110, cause the client device 110 to display the notification.


In some example embodiments, the presentation module 240 may disable or interrupt an ongoing (or previously conducted) transaction conducted through the card reader associated with the transaction code correlated with the instance of fraud in order to prevent fraud. Additionally, the presentation module 240 may deliver a notification to a user account associated with a set of card identifiers identified within the transaction data of the card reader, due to the possibility that the cards associated with the card identifiers may be compromised.



FIG. 5 is a flowchart illustrating additional operations of the method 300 for calculating a vulnerability score of a card reader based on transaction data, consistent with some embodiments. As shown in FIG. 5, one or more operations 510, 520, 530, and 540 may be performed as part (e.g., a precursor task, a subroutine, or a portion) of the method 300, in which the vulnerability detection system 150 correlates existing transaction codes to instances of fraud, according to some example embodiments.


At operation 510, the communication module 210 accesses the database 126 to retrieve transaction data of a card reader (e.g., a first card reader) over a period of time. The period of time may be defined as a temporal period (e.g., “January 1st through January 9th,” “8:00 am through 5:00 pm,” etc.), and/or as a number of transactions conducted on the first card reader (e.g., the last fifty transactions).


At operation 520, the identification module 220 determines one or more of a rate, number, and a frequency in which the common transaction code correlated with the instance of fraud appears within the transaction data over the time period defined in operation 510. For example, the rate may indicate that the transaction code appears in every transaction conducted on the card reader, and the number may indicate a number of times in which the transaction code appears over the time period.


At operation 530, the identification module 220 calculates a vulnerability score of the first card reader based on one or more of the rate, number, and the frequency determined at operation 520. For example, the rate, number, and/or frequency in which the common transaction code appears is divided by a combination of the hourly or daily frequency of all transactions conducted through the card reader to obtain a ratio. To calculate the vulnerability score, the identification module 220 applies one or more statistical techniques (e.g., Kolmogorov-Smirnov Test) to observe the ratio over the course of the period of time (as discussed with respect to operation 510) as compared with a sample set of ratios collected from the card reader from a different period of time (e.g., the preceding week, day, etc.) and/or from similar card readers during the same time period. The card reader is thereby assigned the vulnerability score. In some example embodiments, the identification module 220 may also calculate and assign a vulnerability score to the card readers used to calculate the sample ratios. The vulnerability score indicates a likelihood that a particular card reader is vulnerable, or compromised by an instance of fraud. In some example embodiments, the vulnerability detection system 150 may also access the database 126 to determine an expected rate of the transaction code correlated with the instance of fraud in the set of card readers. Based on the expected rate, the identification module 220 may weight the vulnerability score accordingly.
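The Kolmogorov-Smirnov comparison mentioned above measures the maximum distance between two empirical distributions of ratios. Below is a self-contained sketch of the two-sample statistic; production code would more likely call an established implementation such as `scipy.stats.ks_2samp`:

```python
import bisect

def ks_statistic(sample_a, sample_b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum absolute
    difference between the empirical CDFs of two samples, e.g. the
    current period's ratios vs. a baseline period's ratios."""
    a, b = sorted(sample_a), sorted(sample_b)
    d = 0.0
    for v in sorted(set(a) | set(b)):
        cdf_a = bisect.bisect_right(a, v) / len(a)
        cdf_b = bisect.bisect_right(b, v) / len(b)
        d = max(d, abs(cdf_a - cdf_b))
    return d
```

A statistic near 0 indicates the current period's ratios resemble the baseline; a statistic near 1 indicates the distributions barely overlap, which would push the vulnerability score up.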


In some embodiments, the vulnerability detection system 150 may cause display of the notification upon detecting a transgression of a vulnerability threshold by a vulnerability score. The vulnerability threshold may be either a maximum or a minimum value depending on the embodiment. Accordingly, depending on the embodiment, the vulnerability score may transgress the threshold score by being greater than a maximum value or by being less than a minimum value. The vulnerability threshold may be defined by a user (e.g., flag all card readers with a vulnerability score above or below X), or in some embodiments may be determined based on a normalization function.



FIG. 6 is a diagram 600 illustrating a potentially vulnerable card reader 605, consistent with some embodiments. FIG. 6 includes a card reader 605 and a card 610 (e.g., a credit card). The card reader 605 may include a preconfigured set of transaction codes, as discussed above. The transaction codes indicate a status of a transaction, or a status of the card reader 605 itself. As the card 610 is swiped through the card reader 605, the card reader 605 may determine a status of the transaction and select an appropriate transaction code based on the determined status.


For example, the card reader 605 may determine that when the card 610 was swiped, the network connectivity was limited and the transaction failed to complete. As such, the card reader 605 may have a preconfigured transaction code to indicate that the network connectivity was limited. Similarly, upon swiping the card 610, the card reader 605 may determine that it was unable to read the card 610. The card reader 605 may thereby transmit a message to the database 126 that includes an indication that the card reader 605 was unable to read the card 610.



FIG. 7 is a diagrammatic representation of a machine in the example form of a computer system within which a set of instructions for causing the machine to perform any one or more of the methodologies discussed herein may be executed. Specifically, FIG. 7 shows a diagrammatic representation of the machine 700 in the example form of a system, within which instructions 702 (e.g., software, a program, an application, an applet, an app, a driver, or other executable code) for causing the machine 700 to perform any one or more of the methodologies discussed herein may be executed. For example, the instructions 702 include executable code that causes the machine 700 to execute the method 300. In this way, these instructions 702 transform the general, non-programmed machine into a particular machine programmed to carry out the described and illustrated functions in the manner described herein. The machine 700 may operate as a standalone device or may be coupled (e.g., networked) to other machines.


By way of non-limiting example, the machine 700 may comprise or correspond to a television, a computer (e.g., a server computer, a client computer, a personal computer (PC), a tablet computer, a laptop computer, or a netbook), a set-top box (STB), a personal digital assistant (PDA), an entertainment media system (e.g., an audio/video receiver), a cellular telephone, a smart phone, a mobile device, a wearable device (e.g., a smart watch), a portable media player, or any machine capable of outputting audio signals and capable of executing the instructions 702, sequentially or otherwise, that specify actions to be taken by machine 700. Further, while only a single machine 700 is illustrated, the term “machine” shall also be taken to include a collection of machines 700 that individually or jointly execute the instructions 702 to perform any one or more of the methodologies discussed herein.


The machine 700 may include processors 704, memory 706, storage unit 708 and I/O components 710, which may be configured to communicate with each other such as via a bus 712. In an example embodiment, the processors 704 (e.g., a central processing unit (CPU), a reduced instruction set computing (RISC) processor, a complex instruction set computing (CISC) processor, a graphics processing unit (GPU), a digital signal processor (DSP), an application specific integrated circuit (ASIC), a radio-frequency integrated circuit (RFIC), another processor, or any suitable combination thereof) may include, for example, processor 714 and processor 716 that may execute instructions 702. The term “processor” is intended to include multi-core processors that may comprise two or more independent processors (sometimes referred to as “cores”) that may execute instructions 702 contemporaneously. Although FIG. 7 shows multiple processors 704, the machine 700 may include a single processor with a single core, a single processor with multiple cores (e.g., a multi-core processor), multiple processors with a single core, multiple processors with multiple cores, or any combination thereof.


The memory 706 (e.g., a main memory or other memory storage) and the storage unit 708 are both accessible to the processors 704 such as via the bus 712. The memory 706 and the storage unit 708 store the instructions 702 embodying any one or more of the methodologies or functions described herein. In some embodiments, the database(s) 126 resides on the storage unit 708. The instructions 702 may also reside, completely or partially, within the memory 706, within the storage unit 708, within at least one of the processors 704 (e.g., within the processor's cache memory), or any suitable combination thereof, during execution thereof by the machine 700. Accordingly, the memory 706, the storage unit 708, and the memory of processors 704 are examples of machine-readable media.


As used herein, "machine-readable medium" means a device able to store instructions and data temporarily or permanently and may include, but is not limited to, random-access memory (RAM), read-only memory (ROM), buffer memory, flash memory, optical media, magnetic media, cache memory, other types of storage (e.g., electrically erasable programmable read-only memory (EEPROM)), or any suitable combination thereof. The term "machine-readable medium" should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) able to store instructions 702. The term "machine-readable medium" shall also be taken to include any medium, or combination of multiple media, that is capable of storing instructions (e.g., instructions 702) for execution by a machine (e.g., machine 700), such that the instructions, when executed by one or more processors of the machine 700 (e.g., processors 704), cause the machine 700 to perform any one or more of the methodologies described herein (e.g., methods 400 and 500). Accordingly, a "machine-readable medium" refers to a single storage apparatus or device, as well as "cloud-based" storage systems or storage networks that include multiple storage apparatus or devices. The term "machine-readable medium" excludes signals per se.


Furthermore, the “machine-readable medium” is non-transitory in that it does not embody a propagating signal. However, labeling the tangible machine-readable medium as “non-transitory” should not be construed to mean that the medium is incapable of movement—the medium should be considered as being transportable from one real-world location to another. Additionally, since the machine-readable medium is tangible, the medium may be considered to be a machine-readable device.


The I/O components 710 may include a wide variety of components to receive input, provide output, produce output, transmit information, exchange information, capture measurements, and so on. The specific I/O components 710 that are included in a particular machine will depend on the type of machine. For example, portable machines such as mobile phones will likely include a touch input device or other such input mechanisms, while a headless server machine will likely not include such a touch input device. It will be appreciated that the I/O components 710 may include many other components that are not specifically shown in FIG. 7. The I/O components 710 are grouped according to functionality merely for simplifying the following discussion and the grouping is in no way limiting. In various example embodiments, the I/O components 710 may include input components 718 and output components 720. The input components 718 may include alphanumeric input components (e.g., a keyboard, a touch screen configured to receive alphanumeric input, a photo-optical keyboard, or other alphanumeric input components), point based input components (e.g., a mouse, a touchpad, a trackball, a joystick, a motion sensor, or other pointing instrument), tactile input components (e.g., a physical button, a touch screen that provides location and/or force of touches or touch gestures, or other tactile input components), audio input components, and the like. The output components 720 may include visual components (e.g., a display such as a plasma display panel (PDP), a light emitting diode (LED) display, a liquid crystal display (LCD), a projector, or a cathode ray tube (CRT)), acoustic components (e.g., speakers), haptic components (e.g., a vibratory motor, resistance mechanisms), other signal generators, and so forth.


Communication may be implemented using a wide variety of technologies. The I/O components 710 may include communication components 722 operable to couple the machine 700 to a network 724 or devices 726 via coupling 728 and coupling 730, respectively. For example, the communication components 722 may include a network interface component or other suitable device to interface with the network 724. In further examples, communication components 722 may include wired communication components, wireless communication components, cellular communication components, near field communication (NFC) components, Bluetooth® components (e.g., Bluetooth® Low Energy), WiFi® components, and other communication components to provide communication via other modalities. The devices 726 may be another machine or any of a wide variety of peripheral devices (e.g., a peripheral device coupled via a Universal Serial Bus (USB)).


Modules, Components and Logic


Certain embodiments are described herein as including logic or a number of components, modules, or mechanisms. Modules may constitute either software modules (e.g., code embodied on a machine-readable medium or in a transmission signal) or hardware modules. A hardware module is a tangible unit capable of performing certain operations and may be configured or arranged in a certain manner. In example embodiments, one or more computer systems (e.g., a standalone, client, or server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.


In various embodiments, a hardware module may be implemented mechanically or electronically. For example, a hardware module may comprise dedicated circuitry or logic that is permanently configured (e.g., as a special-purpose processor, such as a field-programmable gate array (FPGA) or an application-specific integrated circuit (ASIC)) to perform certain operations. A hardware module may also comprise programmable logic or circuitry (e.g., as encompassed within a general-purpose processor or other programmable processor) that is temporarily configured by software to perform certain operations. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.


Accordingly, the term “hardware module” should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired) or temporarily configured (e.g., programmed) to operate in a certain manner and/or to perform certain operations described herein. Considering embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time. For example, where the hardware modules comprise a general-purpose processor configured using software, the general-purpose processor may be configured as respective different hardware modules at different times. Software may accordingly configure a processor, for example, to constitute a particular hardware module at one instance of time and to constitute a different hardware module at a different instance of time.


Hardware modules can provide information to, and receive information from, other hardware modules. Accordingly, the described hardware modules may be regarded as being communicatively coupled. Where multiple of such hardware modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses that connect the hardware modules). In embodiments in which multiple hardware modules are configured or instantiated at different times, communications between such hardware modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware modules have access. For example, one hardware module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware module may then, at a later time, access the memory device to retrieve and process the stored output. Hardware modules may also initiate communications with input or output devices, and can operate on a resource (e.g., a collection of information).


The various operations of example methods described herein may be performed, at least partially, by one or more processors that are temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions. The modules referred to herein may, in some example embodiments, comprise processor-implemented modules.


Similarly, the methods described herein may be at least partially processor-implemented. For example, at least some of the operations of a method may be performed by one or more processors or processor-implemented modules. The performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the processor or processors may be located in a single location (e.g., within a home environment, an office environment, or a server farm), while in other embodiments the processors may be distributed across a number of locations.


The one or more processors may also operate to support performance of the relevant operations in a “cloud computing” environment or as a “software as a service” (SaaS). For example, at least some of the operations may be performed by a group of computers (as examples of machines including processors), with these operations being accessible via a network (e.g., the Internet) and via one or more appropriate interfaces (e.g., APIs).


Electronic Apparatus and System


Example embodiments may be implemented in digital electronic circuitry, or in computer hardware, firmware, or software, or in combinations of them. Example embodiments may be implemented using a computer program product, for example, a computer program tangibly embodied in an information carrier, for example, in a machine-readable medium for execution by, or to control the operation of, data processing apparatus, for example, a programmable processor, a computer, or multiple computers.


A computer program can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a standalone program or as a module, subroutine, or other unit suitable for use in a computing environment. A computer program can be deployed to be executed on one computer or on multiple computers at one site, or distributed across multiple sites and interconnected by a communication network.


In example embodiments, operations may be performed by one or more programmable processors executing a computer program to perform functions by operating on input data and generating output. Method operations can also be performed by, and apparatus of example embodiments may be implemented as, special purpose logic circuitry (e.g., an FPGA or an ASIC).


The computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. In embodiments deploying a programmable computing system, it will be appreciated that both hardware and software architectures merit consideration. Specifically, it will be appreciated that the choice of whether to implement certain functionality in permanently configured hardware (e.g., an ASIC), in temporarily configured hardware (e.g., a combination of software and a programmable processor), or in a combination of permanently and temporarily configured hardware may be a design choice. Below are set out hardware (e.g., machine) and software architectures that may be deployed, in various example embodiments.


Language


Although the embodiments of the present invention have been described with reference to specific example embodiments, it will be evident that various modifications and changes may be made to these embodiments without departing from the broader scope of the inventive subject matter. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense. The accompanying drawings that form a part hereof show, by way of illustration, and not of limitation, specific embodiments in which the subject matter may be practiced. The embodiments illustrated are described in sufficient detail to enable those skilled in the art to practice the teachings disclosed herein. Other embodiments may be used and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. This Detailed Description, therefore, is not to be taken in a limiting sense, and the scope of various embodiments is defined only by the appended claims, along with the full range of equivalents to which such claims are entitled.


Such embodiments of the inventive subject matter may be referred to herein, individually and/or collectively, by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any single invention or inventive concept if more than one is in fact disclosed. Thus, although specific embodiments have been illustrated and described herein, it should be appreciated that any arrangement calculated to achieve the same purpose may be substituted for the specific embodiments shown. This disclosure is intended to cover any and all adaptations or variations of various embodiments. Combinations of the above embodiments, and other embodiments not specifically described herein, will be apparent, to those of skill in the art, upon reviewing the above description.


All publications, patents, and patent documents referred to in this document are incorporated by reference herein in their entirety, as though individually incorporated by reference. In the event of inconsistent usages between this document and those documents so incorporated by reference, the usage in the incorporated references should be considered supplementary to that of this document; for irreconcilable inconsistencies, the usage in this document controls.


In this document, the terms "a" or "an" are used, as is common in patent documents, to include one or more than one, independent of any other instances or usages of "at least one" or "one or more." In this document, the term "or" is used to refer to a nonexclusive or, such that "A or B" includes "A but not B," "B but not A," and "A and B," unless otherwise indicated. In the appended claims, the terms "including" and "in which" are used as the plain-English equivalents of the respective terms "comprising" and "wherein." Also, in the following claims, the terms "including" and "comprising" are open-ended; that is, a system, device, article, or process that includes elements in addition to those listed after such a term in a claim is still deemed to fall within the scope of that claim.

Claims
  • 1. A method comprising: receiving data from a device among a set of devices; generating a vulnerability score of the device based on the data; receiving an input that defines a threshold value; determining the vulnerability score transgresses the threshold value; and causing display of a notification that includes at least a device identifier of the device in response to the determining that the vulnerability score of the device transgresses the threshold value.
  • 2. The method of claim 1, wherein the data includes a code, and the generating the vulnerability score includes: accessing a database associated with the set of devices, the database comprising a correlation between the code and an instance of fraudulent activity; and generating the vulnerability score of the device based on the correlation between the code and the instance of fraudulent activity.
  • 3. The method of claim 1, wherein the causing display of the notification further comprises: identifying a user account associated with the set of devices in response to the determining that the vulnerability score transgresses the threshold value, the user account associated with a client device; and causing display of the notification at the client device.
  • 4. The method of claim 1, wherein the data includes a timestamp, and the generating the vulnerability score further comprises: receiving an input that correlates a temporal period with an instance of fraudulent activity; and determining that the timestamp of the data received from the device is within the temporal period.
  • 5. The method of claim 1, wherein the device includes a card reader.
  • 6. The method of claim 1, wherein the causing display of the notification includes: disabling the device among the set of devices in response to the determining that the vulnerability score transgresses the threshold value.
  • 7. A system comprising: one or more processors of a machine; and a memory storing instructions that, when executed by at least one processor among the one or more processors, cause the machine to perform operations comprising: receiving data from a device among a set of devices; generating a vulnerability score of the device based on the data; receiving an input that defines a threshold value; determining the vulnerability score transgresses the threshold value; and causing display of a notification that includes at least a device identifier of the device in response to the determining that the vulnerability score of the device transgresses the threshold value.
  • 8. The system of claim 7, wherein the data includes a code, and the generating the vulnerability score further comprises: accessing a database associated with the set of devices, the database comprising a correlation between the code and an instance of fraudulent activity; and generating the vulnerability score of the device based on the correlation between the code and the instance of fraudulent activity.
  • 9. The system of claim 7, wherein the causing display of the notification further comprises: identifying a user account associated with the set of devices in response to the determining that the vulnerability score transgresses the threshold value, the user account associated with a client device; and causing display of the notification at the client device.
  • 10. The system of claim 7, wherein the data includes a timestamp, and the generating the vulnerability score further comprises: receiving an input that correlates a temporal period with an instance of fraudulent activity; and determining that the timestamp of the data received from the device is within the temporal period.
  • 11. The system of claim 7, wherein the device includes a card reader.
  • 12. A non-transitory machine-readable storage medium comprising instructions that, when executed by one or more processors of a machine, cause the machine to perform operations comprising: receiving data from a device among a set of devices; generating a vulnerability score of the device based on the data; receiving an input that defines a threshold value; determining the vulnerability score transgresses the threshold value; and causing display of a notification that includes at least a device identifier of the device in response to the determining that the vulnerability score of the device transgresses the threshold value.
  • 13. The non-transitory machine-readable storage medium of claim 12, wherein the data includes a code, and the generating the vulnerability score includes: accessing a database associated with the set of devices, the database comprising a correlation between the code and an instance of fraudulent activity; and generating the vulnerability score of the device based on the correlation between the code and the instance of fraudulent activity.
  • 14. The non-transitory machine-readable storage medium of claim 12, wherein the causing display of the notification further comprises: identifying a user account associated with the set of devices in response to the determining that the vulnerability score transgresses the threshold value, the user account associated with a client device; and causing display of the notification at the client device.
  • 15. The non-transitory machine-readable storage medium of claim 12, wherein the data includes a timestamp, and the generating the vulnerability score further comprises: receiving an input that correlates a temporal period with an instance of fraudulent activity; and determining that the timestamp of the data received from the device is within the temporal period.
  • 16. The non-transitory machine-readable storage medium of claim 12, wherein the device includes a card reader.
  • 17. The non-transitory machine-readable storage medium of claim 12, wherein the causing display of the notification includes: disabling the device among the set of devices in response to the determining that the vulnerability score transgresses the threshold value.
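The independent claims above all recite the same flow: receive data from a device, generate a vulnerability score (weighting transaction codes correlated with fraud, per claims 2 and 4 also checking a timestamp against a fraud-correlated temporal period), compare the score against a user-defined threshold, and surface a notification carrying the device identifier. A minimal Python sketch of that flow follows; every name in it (Reading, FRAUD_CODE_WEIGHTS, FRAUD_WINDOW, score_device, check_device) and every weight is a hypothetical illustration, not the patented implementation.

```python
# Illustrative sketch of the claimed scoring flow (claims 1, 2, and 4).
# All names and numeric weights are assumptions for demonstration only.
from dataclasses import dataclass
from datetime import datetime

# Assumed stand-in for the database correlating transaction codes with
# instances of fraudulent activity (claim 2).
FRAUD_CODE_WEIGHTS = {"E41": 0.6, "C07": 0.3}

# Assumed temporal period correlated with fraudulent activity (claim 4).
FRAUD_WINDOW = (datetime(2016, 5, 1), datetime(2016, 5, 14))


@dataclass
class Reading:
    device_id: str
    code: str
    timestamp: datetime


def score_device(readings):
    """Generate a vulnerability score from a device's transaction data."""
    score = 0.0
    for r in readings:
        # Weight codes known to correlate with fraud (claim 2).
        score += FRAUD_CODE_WEIGHTS.get(r.code, 0.0)
        # Bump the score for activity inside the fraud window (claim 4).
        if FRAUD_WINDOW[0] <= r.timestamp <= FRAUD_WINDOW[1]:
            score += 0.1
    return score


def check_device(device_id, readings, threshold):
    """Compare the score to a user-defined threshold; if transgressed,
    return a notification payload that includes the device identifier."""
    score = score_device(readings)
    if score > threshold:
        return {"device_id": device_id, "score": score, "alert": True}
    return None
```

In this sketch the notification payload could equally trigger the disabling step of claims 6 and 17; that action is omitted here for brevity.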
PRIORITY APPLICATION

This application is a continuation of U.S. patent application Ser. No. 16/169,122, filed Oct. 24, 2018, which is a continuation of U.S. patent application Ser. No. 15/724,946, filed Oct. 4, 2017, which is a continuation of U.S. patent application Ser. No. 15/357,655, filed Nov. 21, 2016, the disclosures of which are incorporated herein in their entireties by reference.

US Referenced Citations (216)
Number Name Date Kind
5819226 Gopinathan et al. Oct 1998 A
5892900 Ginter et al. Apr 1999 A
6094643 Anderson et al. Jul 2000 A
6430305 Decker Aug 2002 B1
6820135 Dingman et al. Nov 2004 B1
6978419 Kantrowitz Dec 2005 B1
6980984 Huffman et al. Dec 2005 B1
7168039 Bertram Jan 2007 B2
7617232 Gabbert et al. Nov 2009 B2
7756843 Palmer Jul 2010 B1
7899796 Borthwick et al. Mar 2011 B1
7917376 Bellin et al. Mar 2011 B2
7941321 Greenstein et al. May 2011 B2
8036971 Aymeloglu et al. Oct 2011 B2
8046283 Burns et al. Oct 2011 B2
8054756 Chand et al. Nov 2011 B2
8214490 Vos et al. Jul 2012 B1
8229902 Vishniac et al. Jul 2012 B2
8290838 Thakur et al. Oct 2012 B1
8302855 Ma et al. Nov 2012 B2
8473454 Evanitsky et al. Jun 2013 B2
8484115 Aymeloglu et al. Jul 2013 B2
8589273 Creeden et al. Nov 2013 B2
8600872 Yan Dec 2013 B1
8666861 Li et al. Mar 2014 B2
8688573 Rukonic et al. Apr 2014 B1
8744890 Bernier et al. Jun 2014 B1
8798354 Bunzel et al. Aug 2014 B1
8812960 Sun et al. Aug 2014 B1
8924388 Elliot et al. Dec 2014 B2
8924389 Elliot et al. Dec 2014 B2
8938686 Erenrich et al. Jan 2015 B1
8949164 Mohler Feb 2015 B1
9032531 Scorvo et al. May 2015 B1
9100428 Visbal Aug 2015 B1
9129219 Robertson et al. Sep 2015 B1
9412108 Wang et al. Aug 2016 B2
9842338 Shukla et al. Dec 2017 B1
10176482 Shukla Jan 2019 B1
10796318 Shukla Oct 2020 B2
20010027424 Torigoe Oct 2001 A1
20020065708 Senay et al. May 2002 A1
20020095360 Joao Jul 2002 A1
20020095658 Shulman et al. Jul 2002 A1
20020103705 Brady Aug 2002 A1
20020147805 Leshem et al. Oct 2002 A1
20030126102 Borthwick Jul 2003 A1
20040034570 Davis Feb 2004 A1
20040111480 Yue Jun 2004 A1
20040153418 Hanweck Aug 2004 A1
20040236688 Bozeman Nov 2004 A1
20050010472 Quatse et al. Jan 2005 A1
20050086207 Heuer et al. Apr 2005 A1
20050133588 Williams Jun 2005 A1
20050149455 Bruesewitz et al. Jul 2005 A1
20050154628 Eckart et al. Jul 2005 A1
20050154769 Eckart et al. Jul 2005 A1
20060026120 Carolan et al. Feb 2006 A1
20060143034 Rothermel et al. Jun 2006 A1
20060143075 Carr et al. Jun 2006 A1
20060143079 Basak et al. Jun 2006 A1
20070000999 Kubo et al. Jan 2007 A1
20070011304 Error Jan 2007 A1
20070038646 Thota Feb 2007 A1
20070061259 Zoldi et al. Mar 2007 A1
20070106582 Baker et al. May 2007 A1
20070150801 Chidlovskii et al. Jun 2007 A1
20070156673 Maga et al. Jul 2007 A1
20070185867 Maga et al. Aug 2007 A1
20070239606 Eisen Oct 2007 A1
20070284433 Domenica et al. Dec 2007 A1
20080046481 Gould Feb 2008 A1
20080069081 Chand et al. Mar 2008 A1
20080103798 Domenikos May 2008 A1
20080103996 Forman et al. May 2008 A1
20080140576 Lewis et al. Jun 2008 A1
20080222038 Eden et al. Sep 2008 A1
20080222295 Robinson et al. Sep 2008 A1
20080243711 Aymeloglu et al. Oct 2008 A1
20080255973 El Wade et al. Oct 2008 A1
20080301042 Patzer Dec 2008 A1
20080313132 Hao et al. Dec 2008 A1
20090018996 Hunt et al. Jan 2009 A1
20090076845 Bellin et al. Mar 2009 A1
20090094166 Aymeloglu et al. Apr 2009 A1
20090106178 Chu Apr 2009 A1
20090112745 Stefanescu Apr 2009 A1
20090125359 Knapic et al. May 2009 A1
20090125459 Norton et al. May 2009 A1
20090187546 Whyte Jul 2009 A1
20090187548 Ji et al. Jul 2009 A1
20090228365 Tomchek et al. Sep 2009 A1
20090249244 Robinson et al. Oct 2009 A1
20090271343 Vaiciulis et al. Oct 2009 A1
20090281839 Lynn et al. Nov 2009 A1
20090307049 Elliott, Jr. et al. Dec 2009 A1
20090313463 Pang et al. Dec 2009 A1
20090319418 Herz Dec 2009 A1
20090319891 MacKinlay et al. Dec 2009 A1
20100030722 Goodson et al. Feb 2010 A1
20100031141 Summers et al. Feb 2010 A1
20100042922 Bradateanu et al. Feb 2010 A1
20100057622 Faith Mar 2010 A1
20100070842 Aymeloglu et al. Mar 2010 A1
20100094765 Nandy Apr 2010 A1
20100098318 Anderson Apr 2010 A1
20100114887 Conway et al. May 2010 A1
20100131502 Fordham May 2010 A1
20100161735 Sharma Jun 2010 A1
20100169192 Zoldi et al. Jul 2010 A1
20100191563 Schlalfer et al. Jul 2010 A1
20100235915 Memon et al. Sep 2010 A1
20100262688 Hussain et al. Oct 2010 A1
20100312837 Bodapati et al. Dec 2010 A1
20110004626 Naeymi-rad et al. Jan 2011 A1
20110055074 Chen et al. Mar 2011 A1
20110061013 Bilicki et al. Mar 2011 A1
20110078173 Seligmann et al. Mar 2011 A1
20110093327 Fordyce, III et al. Apr 2011 A1
20110099133 Chang et al. Apr 2011 A1
20110099628 Lanxner et al. Apr 2011 A1
20110131122 Griffin et al. Jun 2011 A1
20110153384 Horne et al. Jun 2011 A1
20110173093 Psota et al. Jul 2011 A1
20110208565 Ross et al. Aug 2011 A1
20110213655 Henkin et al. Sep 2011 A1
20110218955 Tang et al. Sep 2011 A1
20110225586 Bentley et al. Sep 2011 A1
20110231305 Winters Sep 2011 A1
20110270604 Qi et al. Nov 2011 A1
20110270834 Sokolan et al. Nov 2011 A1
20110289397 Eastmond et al. Nov 2011 A1
20110295649 Fine et al. Dec 2011 A1
20110307382 Siegel Dec 2011 A1
20110314007 Dassa et al. Dec 2011 A1
20110314024 Chang et al. Dec 2011 A1
20120011238 Rathod Jan 2012 A1
20120011245 Gillette et al. Jan 2012 A1
20120022945 Falkenborg et al. Jan 2012 A1
20120054284 Rakshit Mar 2012 A1
20120059853 Jagota Mar 2012 A1
20120066166 Curbera et al. Mar 2012 A1
20120079363 Folting et al. Mar 2012 A1
20120084117 Tavares et al. Apr 2012 A1
20120084287 Lakshminarayan et al. Apr 2012 A1
20120131512 Takeuchi et al. May 2012 A1
20120158585 Ganti Jun 2012 A1
20120159362 Brown et al. Jun 2012 A1
20120173381 Smith Jul 2012 A1
20120215784 King et al. Aug 2012 A1
20120221553 Wittmer et al. Aug 2012 A1
20120226523 Weiss et al. Sep 2012 A1
20120245976 Kumar et al. Sep 2012 A1
20120278249 Duggal et al. Nov 2012 A1
20120323888 Osann, Jr. Dec 2012 A1
20130016106 Yip et al. Jan 2013 A1
20130054306 Bhaila et al. Feb 2013 A1
20130057551 Ebert et al. Mar 2013 A1
20130096988 Grossman et al. Apr 2013 A1
20130110746 Ahn May 2013 A1
20130151453 Bhanot et al. Jun 2013 A1
20130166348 Scotto Jun 2013 A1
20130166480 Popescu et al. Jun 2013 A1
20130185245 Anderson Jul 2013 A1
20130185307 El-yaniv et al. Jul 2013 A1
20130226318 Procyk et al. Aug 2013 A1
20130238616 Rose et al. Sep 2013 A1
20130238664 Hsu et al. Sep 2013 A1
20130246170 Gross et al. Sep 2013 A1
20130246537 Gaddala Sep 2013 A1
20130246597 Iizawa et al. Sep 2013 A1
20130262328 Federgreen Oct 2013 A1
20130263019 Castellanos Oct 2013 A1
20130282696 John et al. Oct 2013 A1
20130290825 Arndt et al. Oct 2013 A1
20130297619 Chandrasekaran et al. Nov 2013 A1
20130304770 Boero et al. Nov 2013 A1
20130325826 Agarwal et al. Dec 2013 A1
20140012724 O'leary Jan 2014 A1
20140012796 Petersen et al. Jan 2014 A1
20140040371 Gurevich et al. Feb 2014 A1
20140058914 Song et al. Feb 2014 A1
20140068487 Steiger et al. Mar 2014 A1
20140095363 Caldwell Apr 2014 A1
20140095509 Patton Apr 2014 A1
20140108380 Gotz et al. Apr 2014 A1
20140108985 Scott et al. Apr 2014 A1
20140123279 Bishop et al. May 2014 A1
20140136285 Carvalho May 2014 A1
20140143009 Brice et al. May 2014 A1
20140156527 Grigg et al. Jun 2014 A1
20140157172 Peery et al. Jun 2014 A1
20140164502 Khodorenko et al. Jun 2014 A1
20140189536 Lange et al. Jul 2014 A1
20140195515 Baker et al. Jul 2014 A1
20140222521 Chait Aug 2014 A1
20140222752 Isman et al. Aug 2014 A1
20140222793 Sadkin et al. Aug 2014 A1
20140229554 Grunin et al. Aug 2014 A1
20140344230 Krause et al. Nov 2014 A1
20140358789 Boding et al. Dec 2014 A1
20140358829 Hurwitz Dec 2014 A1
20140366132 Stiansen et al. Dec 2014 A1
20150073929 Psota et al. Mar 2015 A1
20150073954 Braff Mar 2015 A1
20150095773 Gonsalves et al. Apr 2015 A1
20150100897 Sun et al. Apr 2015 A1
20150106379 Elliot et al. Apr 2015 A1
20150134512 Mueller May 2015 A1
20150135256 Hoy et al. May 2015 A1
20150161611 Duke et al. Jun 2015 A1
20150188872 White Jul 2015 A1
20150338233 Cervelli et al. Nov 2015 A1
20150379413 Robertson et al. Dec 2015 A1
20160004764 Chakerian et al. Jan 2016 A1
20190057398 Shukla et al. Feb 2019 A1
Foreign Referenced Citations (13)
Number Date Country
102546446 Jul 2012 CN
103167093 Jun 2013 CN
102054015 May 2014 CN
102014204827 Sep 2014 DE
102014204830 Sep 2014 DE
102014204834 Sep 2014 DE
2487610 Aug 2012 EP
2858018 Apr 2015 EP
2869211 May 2015 EP
2889814 Jul 2015 EP
2892197 Jul 2015 EP
2963595 Jan 2016 EP
WO-2005116851 Dec 2005 WO
Non-Patent Literature Citations (122)
Entry
“5 Great Tools for Visualizing your Twitter Followers”, Amnet Blog, [Online] Retrieved from the Internet: <URL: http://www.amnetblog.com/component/content/article/115-5-great-tools-for-visualizing-your-twitter-followers.html>, (Aug. 4, 2010), 1-5.
“About OWA”, Open Web Analytics, [Online]. Retrieved from the Internet: <URL: http://www.openwebanalytics.com/?page jd=2>, (Accessed: Jul. 19, 2013), 5 pgs.
“An Introduction to KeyLines and Network Visualization”, Keylines.com, [Online]. Retrieved from the Internet: <URL: http://keylines.com/wp-content/uploads/2014/03/KeyLines-White-Paper.pdf>, (Mar. 2014), 8 pgs.
“Analytics For Data Driven Startups”, Trak.io, [Online], Retrieved from the Internet: <URL: http://trak.io/>, (Accessed: Jul. 18, 2013), 3 pgs.
“U.S. Appl. No. 13/827,491, Final Office Action dated Jun. 22, 2015”, 28 pgs.
“U.S. Appl. No. 13/827,491, Non Final Office Action dated Mar. 30, 2016”, 25 pgs.
“U.S. Appl. No. 13/827,491, Non Final Office Action dated Oct. 9, 2015”, 16 pgs.
“U.S. Appl. No. 13/827,491, Non Final Office Action dated Dec. 1, 2014”, 5 pgs.
“U.S. Appl. No. 14/141,252, Non Final Office Action dated Oct. 8, 2015”, 11 pgs.
“U.S. Appl. No. 14/225,006, Advisory Action dated Dec. 21, 2015”, 4 pgs.
“U.S. Appl. No. 14/225,006, Final Office Action dated Sep. 2, 2015”, 28 pgs.
“U.S. Appl. No. 14/225,006, First Action Interview Pre-Interview Communication dated Feb. 27, 2015”, 5 pgs.
“U.S. Appl. No. 14/225,006, First Action Interview Pre-Interview Communication dated Sep. 10, 2014”, 4 pgs.
“U.S. Appl. No. 14/225,084, Examiner Interview Summary dated Jan. 4, 2016”, 3 pgs.
“U.S. Appl. No. 14/225,084, First Action Interview Pre-Interview Communication dated Feb. 20, 2015”, 5 pgs.
“U.S. Appl. No. 14/225,084, First Action Interview Pre-Interview Communication dated Sep. 2, 2014”, 17 pgs.
“U.S. Appl. No. 14/225,084, Non Final Office Action dated Sep. 11, 2015”, 13 pgs.
“U.S. Appl. No. 14/225,084, Notice of Allowance dated May 4, 2015”, 26 pgs.
“U.S. Appl. No. 14/225,160, Advisory Action dated May 20, 2015”, 7 pgs.
“U.S. Appl. No. 14/225,160, Final Office Action dated Feb. 11, 2015”, 30 pgs.
“U.S. Appl. No. 14/225,160, First Action Interview Pre-Interview Communication dated Jul. 29, 2014”, 19 pgs.
“U.S. Appl. No. 14/225,160, First Action Interview Pre-Interview Communication dated Oct. 22, 2014”, 6 pgs.
“U.S. Appl. No. 14/225,160, Non Final Office Action dated Aug. 12, 2015”, 23 pgs.
“U.S. Appl. No. 14/306,138, Examiner Interview Summary dated Dec. 3, 2015”, 3 pgs.
“U.S. Appl. No. 14/306,138, Examiner Interview Summary dated Dec. 24, 2015”, 5 pgs.
“U.S. Appl. No. 14/306,147, Final Office Action dated Dec. 24, 2015”, 22 pgs.
“U.S. Appl. No. 14/319,161, Final Office Action dated Jan. 23, 2015”, 21 pgs.
“U.S. Appl. No. 14/319,161, Notice of Allowance dated May 4, 2015”, 6 pgs.
“U.S. Appl. No. 14/323,935, Notice of Allowance dated Oct. 1, 2015”, 8 pgs.
“U.S. Appl. No. 14/451,221, Non Final Office Action dated Oct. 21, 2014”, 16 pgs.
“U.S. Appl. No. 14/463,615, Advisory Action dated Sep. 10, 2015”, 3 pgs.
“U.S. Appl. No. 14/463,615, Final Office Action dated May 21, 2015”, 31 pgs.
“U.S. Appl. No. 14/463,615, First Action Interview Pre-Interview Communication dated Jan. 28, 2015”, 29 pgs.
“U.S. Appl. No. 14/463,615, First Action Interview Pre-Interview Communication dated Nov. 13, 2014”, 4 pgs.
“U.S. Appl. No. 14/463,615, Non Final Office Action dated Dec. 9, 2015”, 44 pgs.
“U.S. Appl. No. 14/479,863, First Action Interview Pre-Interview Communication dated Dec. 26, 2014”, 5 pgs.
“U.S. Appl. No. 14/479,863, Notice of Allowance dated Mar. 31, 2015”, 23 pgs.
“U.S. Appl. No. 14/483,527, Final Office Action dated Jun. 22, 2015”, 17 pgs.
“U.S. Appl. No. 14/483,527, First Action Interview Pre-Interview Communication dated Jan. 28, 2015”, 6 pgs.
“U.S. Appl. No. 14/483,527, Non Final Office Action dated Oct. 28, 2015”, 20 pgs.
“U.S. Appl. No. 14/516,386, Applicant-initiated Interview Summary dated Jun. 30, 2016”, 5 pgs.
“U.S. Appl. No. 14/516,386, First Action Interview Pre-Interview Communication dated Feb. 24, 2016”, 16 pgs.
“U.S. Appl. No. 14/552,336, First Action Interview Pre-Interview Communication dated Jul. 20, 2015”, 18 pgs.
“U.S. Appl. No. 14/552,336, Notice of Allowance dated Nov. 3, 2015”, 13 pgs.
“U.S. Appl. No. 14/562,524, First Action Interview Pre-Interview Communication dated Sep. 14, 2015”, 12 pgs.
“U.S. Appl. No. 14/562,524, First Action Interview Pre-Interview Communication dated Nov. 10, 2015”, 6 pgs.
“U.S. Appl. No. 14/571,098, First Action Interview dated Aug. 24, 2015”, 4 pgs.
“U.S. Appl. No. 14/571,098, First Action Interview Pre-Interview Communication dated Mar. 11, 2015”, 4 pgs.
“U.S. Appl. No. 14/571,098, First Action Interview Pre-Interview Communication dated Aug. 5, 2015”, 4 pgs.
“U.S. Appl. No. 14/571,098, First Action Interview Pre-Interview Communication dated Nov. 10, 2015”, 5 pgs.
“U.S. Appl. No. 14/676,621, First Action Interview Pre-Interview Communication dated Sep. 10, 2015”, 5 pgs.
“U.S. Appl. No. 14/676,621, Examiner Interview Summary dated Jul. 30, 2015”, 5 pgs.
“U.S. Appl. No. 14/676,621, Final Office Action dated Oct. 29, 2015”, 10 pgs.
“U.S. Appl. No. 14/746,671, First Action Interview Pre-Interview Communication dated Nov. 12, 2015”, 19 pgs.
“U.S. Appl. No. 14/746,671, Notice of Allowance dated Jan. 21, 2016”, 7 pgs.
“U.S. Appl. No. 14/800,447, First Action Interview—Pre-Interview Communication dated Dec. 10, 2015”, 6 pgs.
“U.S. Appl. No. 14/813,749, Non Final Office Action dated Sep. 28, 2015”, 22 pgs.
“U.S. Appl. No. 14/842,734, First Action Interview Pre-Interview Communication dated Nov. 19, 2015”, 17 pgs.
“U.S. Appl. No. 14/923,364, Notice of Allowance dated May 6, 2016”, 36 pgs.
“U.S. Appl. No. 14/923,374, First Action Interview—Pre-Interview Communication dated Feb. 9, 2016”, 4 pgs.
“U.S. Appl. No. 14/923,374, First Action Interview Pre-Interview Communication dated May 23, 2016”, 29 pgs.
“U.S. Appl. No. 15/017,324, First Action Interview Pre-Interview Communication dated Apr. 22, 2016”, 20 pgs.
“U.S. Appl. No. 15/357,655, Final Office Action dated Jun. 9, 2017”, 5 pgs.
“U.S. Appl. No. 15/357,655, First Action Interview—Pre-Interview Communication dated Feb. 9, 2017”, 6 pgs.
“U.S. Appl. No. 15/357,655, Notice of Allowance dated Aug. 10, 2017”, 8 pgs.
“U.S. Appl. No. 16/169,122, First Action Interview—Pre-Interview Communication dated Feb. 8, 2018”, 4 pgs.
“U.S. Appl. No. 16/169,122, First Action Interview—Office Action Summary dated Dec. 5, 2019”, 7 pgs.
“U.S. Appl. No. 16/169,122, First Action Interview—Pre-Interview Communication dated Aug. 22, 2019”, 7 pgs.
“U.S. Appl. No. 16/169,122, Notice of Allowance dated Jun. 4, 2020”, 7 pgs.
“U.S. Appl. No. 16/169,122, Response filed Feb. 5, 2020 to First Action Interview—Office Action Summary dated Dec. 5, 2019”, 9 pgs.
“Apsalar—Mobile App Analytics & Advertising”, Data Powered Mobile Advertising, https://apsalar.com/, (Jul. 18, 2013), 1-8.
“Beta Testing On The Fly”, TestFlight, [Online]. Retrieved from the Internet: <URL: https://testflightapp.com/>, (Accessed: Jul. 18, 2013), 3 pgs.
“Countly”, Countly Mobile Analytics, [Online]. Retrieved from the Internet: <URL: http://count.ly/products/screenshots>, (Accessed: Jul. 18, 2013), 9 pgs.
“DISTIMO—App Analytics”, [Online]. Retrieved from the Internet: <URL: http://www.distimo.com/app-analytics>, (Accessed: Jul. 18, 2013), 5 pgs.
“European Application Serial No. 14187996.5, Extended European Search Report dated Feb. 12, 2015”, 7 pgs.
“European Application Serial No. 14191540.5, Extended European Search Report dated May 27, 2015”, 9 pgs.
“European Application Serial No. 14200246.8, Extended European Search Report dated May 29, 2015”, 8 pgs.
“European Application Serial No. 14200298.9, Extended European Search Report dated May 13, 2015”, 7 pgs.
“European Application Serial No. 15181419.1, Extended European Search Report dated Sep. 29, 2015”, 7 pgs.
“European Application Serial No. 15184764.7, Extended European Search Report dated Dec. 14, 2015”, 8 pgs.
“Flurry Analytics”, [Online]. Retrieved from the Internet: <URL: http://www.flurry.com/>, (Accessed: Jul. 18, 2013), 14 pgs.
“Google Analytics Official Website—Web Analytics & Reporting”, [Online]. Retrieved from the Internet: <URL: http://www.google.com/analytics/index.html>, (Accessed: Jul. 18, 2013), 22 pgs.
“Great Britain Application Serial No. 1404486.1, Combined Search Report and Examination Report dated Aug. 27, 2014”, 5 pgs.
“Great Britain Application Serial No. 1404486.1, Office Action dated May 21, 2015”, 2 pgs.
“Great Britain Application Serial No. 1404489.5, Combined Search Report and Examination Report dated Aug. 27, 2014”, 5 pgs.
“Great Britain Application Serial No. 1404489.5, Office Action dated May 21, 2015”, 3 pgs.
“Great Britain Application Serial No. 1404489.5, Office Action dated Oct. 6, 2014”, 1 pg.
“Great Britain Application Serial No. 1404499.4, Combined Search Report and Examination Report dated Aug. 20, 2014”, 6 pgs.
“Great Britain Application Serial No. 1404499.4, Office Action dated Jun. 11, 2015”, 5 pgs.
“Great Britain Application Serial No. 1404499.4, Office Action dated Sep. 29, 2014”, 1 pg.
“Help File for ModelRisk Version 5—Part 1”, Vose Software, (2007), 375 pgs.
“Help File for ModelRisk Version 5—Part 2”, Vose Software, (2007), 362 pgs.
“Hunchlab: Heat Map and Kernel Density Calculation for Crime Analysis”, Azavea Journal, [Online], Retrieved from the Internet: <URL: www.azavea.com/blogs/newsletter/v4i4/kernel-density-capabilities-added-to-hunchlab>, (Sep. 9, 2014), 2 pgs.
“KeyLines Datasheet”, Keylines.com, [Online]. Retrieved from the Internet: <URL: http://keylines.com/wp-content/uploads/2014/03/KeyLines-datasheet.pdf>, (Mar. 2014), 2 pgs.
“Mixpanel: Actions speak louder than page views”, Mobile Analytics, [Online], Retrieved from the Internet: <URL: https://mixpanel.com/>, (Accessed: Jul. 18, 2013), 13 pgs.
“Mobile App Marketing & Analytics”, Localytics, [Online]. Retrieved from the Internet: <URL: http://www.localytics.com/>, (Accessed: Jul. 18, 2013), 12 pgs.
“More than android analytics”, UserMetrix, [Online]. Retrieved from the Internet: <URL: http://usermetrix.com/android-analytics>, (Accessed: Jul. 18, 2013), 3 pgs.
“More Than Mobile Analytics”, Kontagent, [Online]. Retrieved from the Internet: <URL: http://www. kontagent. com/>, (Accessed: Jul. 18, 2013), 9 pgs.
“Multimap”, Wikipedia, [Online]. Retrieved from the Internet: <URL: https://en.wikipedia.org/w/index.php?title=Multimap&oldid=530800748>, (Jan. 1, 2013), 2 pgs.
“Netherlands Application Serial No. 2012417, Netherlands Search Report dated Sep. 18, 2015”, W/ English Translation, 9 pgs.
“Netherlands Application Serial No. 2012421, Netherlands Search Report dated Sep. 18, 2015”, 8 pgs.
“Netherlands Application Serial No. 2012438, Search Report dated Sep. 21, 2015”, 8 pgs.
“New Zealand Application Serial No. 622473, First Examination Report dated Mar. 27, 2014”, 3 pgs.
“New Zealand Application Serial No. 622473, Office Action dated Jun. 19, 2014”, 2 pgs.
“New Zealand Application Serial No. 622513, Office Action dated Apr. 3, 2014”, 2 pgs.
“New Zealand Application Serial No. 628161, First Examination Report dated Aug. 25, 2014”, 2 pgs.
“Piwik—Free Web Analytics Software”, Piwik, [Online]. Retrieved from the Internet: <URL: http://piwik.org/>, (Accessed: Jul. 19, 2013), 18 pgs.
“Realtime Constant Customer Touchpoint”, Capptain—Pilot your apps, [Online]. Retrieved from the Internet: <URL: http://www.capptain.com>, (Accessed: Jul. 18, 2013), 6 pgs.
“Refresh CSS ellipsis when resizing container”, Stack Overflow, [Online]. Retrieved from the Internet: <URL: http://stackoverflow.com/questions/17964681/refresh-css-ellipsis-when-resizing-container>, Accessed: May 18, 2015, (Jul. 31, 2013), 1 pg.
“Smart Thinking for Super Apps”, Appacts: Open Source Mobile Analytics Platform, [Online] Retrieved from the Internet: <URL: http://www.appacts.com>, (Jul. 18, 2013), 1-4.
“Visualizing Threats: Improved Cyber Security Through Network Visualization”, Keylines.com, [Online] retrieved from the internet: <http://keylines.com/wp-content/uploads/2014/04/Visualizing-Threats1.pdf>, (May 12, 2014), 10 pgs.
“Welcome to StatCounter—Visitor Analysis for Your Website”, StatCounter—Free Invisible Web Tracker, Hit Counter and Web Stats, [Online]. Retrieved from the Internet: <URL: http://statcounter.com/>, (Accessed: Jul. 19, 2013), 17 pgs.
Chaudhuri, Surajit, et al., “An Overview of Business Intelligence Technology”, Communications of the ACM, vol. 54, No. 8., (Aug. 2011), 88-98.
Cohn, David, et al., “Semi-supervised Clustering with User Feedback”, Cornell University, Constrained Clustering: Advances in Algorithms, Theory, and Applications 4.1, (2003), 9 pgs.
Gorr, et al., “Crime Hot Spot Forecasting: Modeling and Comparative Evaluation”, Grant 98-IJ-CX-K005, (May 6, 2002), 37 pgs.
Gu, Lifang, et al., “Record Linkage: Current Practice and Future Directions”, (Jan. 15, 2004), 32 pgs.
Hansen, D., et al., “Analyzing Social Media Networks with NodeXL: Insights from a Connected World”, Chapter 4, pp. 53-67 and Chapter 10, pp. 143-164, (Sep. 2010), 53-67; 143-164.
Hua, Yu, et al., “A Multi-attribute Data Structure with Parallel Bloom Filters for Network Services”, HiPC 2006, LNCS 4297, (2006), 277-288.
Manno, et al., “Introducing Collaboration in Single-user Applications through the Centralized Control Architecture”, (2010), 10 pgs.
Sigrist, Christian, et al., “PROSITE, a Protein Domain Database for Functional Characterization and Annotation”, Nucleic Acids Research, vol. 38, (2010), D161-D166.
Valentini, Giorgio, et al., “Ensembles of Learning Machines”, Lecture Notes in Computer Science: Neural Nets, Springer Berlin Heidelberg, (Sep. 26, 2002), 3-20.
Wang, Guohua, et al., “Research on a Clustering Data De-Duplication Mechanism Based on Bloom Filter”, IEEE, (2010), 5 pgs.
Related Publications (1)
Number Date Country
20210042763 A1 Feb 2021 US
Continuations (3)
Number Date Country
Parent 16169122 Oct 2018 US
Child 17003029 US
Parent 15724946 Oct 2017 US
Child 16169122 US
Parent 15357655 Nov 2016 US
Child 15724946 US