Conditional behavioural biometrics

Information

  • Patent Grant
  • 11238349
  • Patent Number
    11,238,349
  • Date Filed
    Tuesday, June 16, 2020
  • Date Issued
    Tuesday, February 1, 2022
Abstract
The present invention relates to an improved method of providing identification of a user or authentication of a user's identity. More particularly, the present invention relates to an improved method of providing identification of a user or authentication of a user's identity using conditional behavioural biometrics.
Description
FIELD OF THE INVENTION

The present invention relates to an improved method of providing identification of a user or authentication of a user's identity. More particularly, the present invention relates to an improved method of providing identification of a user or authentication of a user's identity using conditional behavioural biometrics.


BACKGROUND

For the purposes of this specification, identification typically involves the collection of data and a determination of who a user is from a database of users while authentication typically involves the use of data to confirm a user is who they present themselves to be (i.e. to verify a user's identity).


Identification and/or authentication of a user identity is an essential step in accessing many secure services or devices, such as banking, stored personal details or other restricted data. This identification and/or authentication is usually achieved by the use of passwords or personal identification numbers (PINs), which are usually assumed to be known only by the authorised user or users of a service or device.


However, knowledge of a user's password or PIN is enough for an unauthorised third party to gain access to the service or device. Additional layers of security, or improved security, are therefore required to reduce the risk of passwords and PINs being used by unauthorised third parties.


Adding further security measures to the authentication process usually requires a trade-off between the increased level of security and the degradation of the user experience.


SUMMARY OF THE INVENTION

The present invention seeks to provide an enhanced method of authenticating and/or identifying a user identity using conditional behavioural biometrics.


According to a first aspect of the present invention, there is provided a method of generating a user profile for use in identifying and/or authenticating a user on a device, the device equipped with one or more sensors, the method comprising: generating a set of data points from sensory data collected by the one or more sensors; clustering the set of data points to produce a set of data clusters; developing a first classifier for the data clusters, the first classifier being operable to assign a further data point, derived from a further user interaction with the device, to one of the data clusters; and developing one or more further classifiers for at least one of the data clusters, the one or more further classifiers being operable to identify and/or authenticate a user identity based on the further data point.


According to a second aspect of the invention, there is provided a method of identifying and/or authenticating a user on a device, the device equipped with one or more sensors, the method comprising: generating a data point from sensory data derived from a user interaction with the device; assigning the data point to a cluster of data points using a first classifier, the first classifier developed from a plurality of previous user interactions with the device; and applying a second classifier to the data point, the second classifier being chosen based on the assigned data cluster and operable to identify and/or authenticate a user identity based on the data point.


By classifying sensory data based on previous user interactions with a computing device, an additional layer of security can be provided over solely using a password or PIN, because the device can identify and/or authenticate a user separately from the user providing credentials to identify or authenticate themselves (or to identify and then authenticate themselves). The classifier is developed from a plurality of previous user interactions with the computing device. Using conditional behavioural biometrics can remove the trade-off between security and degradation of the user experience that arises when using PINs or passwords to authenticate a user. Optionally, the classifier can be based on biometric and/or behavioural data collected, further optionally on biometric and/or behavioural data collected during a user interaction with the device. By clustering user data while generating a user profile, different user contexts can be identified and a separate identification and/or authentication classifier can be trained for each one. Having context-specific classifiers for identification and/or authentication can allow a higher accuracy of identification and/or authentication than using a single classifier for every situation, since a user will in general interact with a computing device differently depending on the context. Herein, sensory data is used to connote data from or derived from the sensors, i.e. sensor data. A classifier is taken to connote any algorithm for statistical classification of a data set.


Optionally, the sensory data is collected during a plurality of user interactions with the device.


By using the data collected during multiple user interactions with the device to develop the user profile, a more accurate set of classifiers may be obtained.


Optionally, an algorithm is used to perform the clustering of the data points derived from the sensory data.


Optionally, the algorithm is one or more clustering algorithms based on one of a K-means algorithm or a Gaussian Mixture Model using an Expectation-Maximisation algorithm.


K-means algorithms and Gaussian Mixture Models can be used to efficiently cluster the user data during the generation of a user profile.


Optionally, the identification and/or authentication of the user further comprises the step of identifying and/or authenticating the user identity using the further classifiers.


Using the classifiers to identify and/or authenticate the user identity can remove the need for the user to manually enter identification and/or authentication data, enhancing the user experience.


Optionally, the output of the second classifier is a confidence score in the user identity and/or authentication.


Optionally, if the confidence score is below a pre-defined threshold, further identification and/or authentication steps are carried out or the user is not identified and/or authenticated.


By providing a confidence score and using it to determine if further identification and/or authentication steps are required, the error rate associated with false negatives can be reduced.


Optionally, the computing device comprises at least one of: a mobile device; a local server; a cloud or network based server; and a desktop computer.


The profile generation and identification/authentication methods can be performed locally on the user device, remotely on a server or on a combination of the two. This allows for greater flexibility in the implementation of the method.





BRIEF DESCRIPTION OF THE DRAWINGS

Embodiments of the present invention will now be described, by way of example only and with reference to the accompanying drawings having like-reference numerals, in which:



FIG. 1 is a flow chart illustrating the enrolment stage of the method according to an embodiment; and



FIG. 2 is a flow chart illustrating the authentication stage of the method according to an embodiment.





SPECIFIC DESCRIPTION

Referring to FIGS. 1 and 2, an exemplary embodiment of the method will now be described.


The method relates to providing enhanced identification and/or authentication of the identity of a user on a computing device, by using the user's behaviour while interacting with the device. The method is principally divided into two stages: (a) generating a user profile (herein referred to as the "enrolment stage") based on data collected by the device, optionally only while a user is interacting with the device and further optionally while the user is inputting passwords, PINs or any identity or security data; and (b) the authentication stage, wherein the user profile generated in the enrolment stage is used to authenticate a user identity based on behavioural data collected while inputting a password, PIN or any identity or security data.


Referring to FIG. 1, the enrolment stage of the method will now be described according to an embodiment of the method.


The object of the enrolment stage of the method is to generate a user profile from data generated during a user's interactions with a computing device. The user profile comprises a set of K data clusters corresponding to different contexts of user activity, a classifier C for assigning new data points to clusters, and a further classifier Ci associated with each cluster for determining, from sensory data collected while the user is entering their details, whether or not a user is authentic.
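
Purely as an illustrative sketch (not part of the claimed method), the user profile described above can be represented as a simple container holding the cluster model produced by algorithm A, the cluster-assignment classifier C and the per-cluster authentication classifiers Ci, together with the retained training points. The Python class and field names below are assumptions made for the example.

    from dataclasses import dataclass, field
    from typing import Any, Dict, List

    @dataclass
    class UserProfile:
        # Model of the K behavioural clusters produced by algorithm A
        cluster_model: Any = None
        # Classifier C: assigns a new data point to one of the K clusters
        cluster_assigner: Any = None
        # One authentication classifier Ci per cluster, keyed by cluster index
        cluster_classifiers: Dict[int, Any] = field(default_factory=dict)
        # Retained training data points, used for future profile updates
        data_points: List[List[float]] = field(default_factory=list)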


The user activity contexts include, but are not limited to, the user location, the user activity (for example the user's movement speed or whether the user is listening to music), the computing device's connections, or the device orientation (for example, if the computing device is a mobile device, whether it is in portrait or landscape orientation).


The use of different classifiers for authenticating the user, taking into account the context, allows for a higher degree of accuracy in authenticating the user identity than the use of a single classifier for all situations. For example, a user interacting with a device held in their hand while walking will interact with it differently from how they would interact with a device on a table in front of them while sitting. The orientation of the device (i.e. whether it is in portrait or landscape mode) may also affect how the user interacts with the device.


During use, a user can interact with a computing device to input, for example, a password or PIN for accessing a secure service or the device. The computing device may be, for example, a personal computer (such as a desktop or laptop computer), mobile computing device (such as a mobile telephone or tablet) or a fixed terminal (such as an ATM or touchscreen kiosk). The computing device is equipped with one or more sensors for measuring certain properties of the user's interaction with the device, and/or environmental properties during the interaction and/or in the background during normal use of the device. The data derived directly from the device sensors will herein be referred to as raw sensory data.


For example, if interacting with a touchscreen device, the touch time, touch timing, touch pressure, touch area and touch location coordinates can be detected and recorded. Other non-limiting examples of raw sensory data include accelerometer data, gyroscopic data, GPS co-ordinates and hover co-ordinates.


Raw sensory data need not only be collected when the user is consciously interacting with the device. For example, if the device is a mobile device, then raw sensory data may be collected while the device is in the user's pocket to provide information about, for example, the user's walking style or how they sit or stand. As raw sensory data can be collected continuously by the device (whether it is being operated by the user or not), the monitoring of the user's sensory data is continuous and invisible to the user, and thus the verification of the user's identity can similarly be continuous and invisible to the user. In contrast, the use of a PIN or fingerprint can only verify identity at a single point in time.


The raw sensory data associated with the user's interaction with the device is then passed to a pre-processor (or processor), where it is converted into derived sensory data. Derived sensory data comprises a set of features that can be calculated or derived from the raw sensory data, but which features may not be determined directly by the device's sensors. Non-sensory data derived from other sources (such as the internet) may also be combined with the raw sensory data to generate derived sensory data or used as raw sensory data, depending on the implementation details of the method.


Features derived from raw sensory data can include: a duration of touchscreen interaction; a physical touchscreen interaction distance; a time between touchscreen interactions; maximum, minimum and/or average deviation from a straight line during a touchscreen interaction; acceleration and/or deceleration of a touchscreen interaction; curvature of a touchscreen interaction; length of a touchscreen interaction (all derived from touchscreen associated data); background tremor while using the device; tremor during the interaction with the device (both derived from accelerometer and gyroscope data); device (and therefore user) movement speed (derived from device GPS coordinates and/or other device location service or services); and the orientation of the computing device (derived from magnetometer data). Many other example features are possible to derive from the raw sensory data.
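
By way of illustration only, a few of the derived features listed above (duration, physical interaction distance, maximum deviation from a straight line and a simple curvature measure of a touchscreen interaction) could be computed from a sequence of raw touch samples as in the following sketch. The sample layout and the exact feature definitions are assumptions for the example, not a definition of the derived sensory data.

    import numpy as np

    def derive_touch_features(samples):
        """samples: list of (timestamp, x, y, pressure) tuples for one touchscreen interaction."""
        t = np.array([s[0] for s in samples], dtype=float)
        xy = np.array([[s[1], s[2]] for s in samples], dtype=float)
        pressure = np.array([s[3] for s in samples], dtype=float)

        duration = t[-1] - t[0]                                # duration of the interaction
        steps = np.diff(xy, axis=0)
        path_length = np.linalg.norm(steps, axis=1).sum()      # physical interaction distance

        # Maximum deviation of the trajectory from the straight line joining start and end points
        chord = xy[-1] - xy[0]
        chord_length = np.linalg.norm(chord) or 1.0
        rel = xy - xy[0]
        deviations = np.abs(rel[:, 0] * chord[1] - rel[:, 1] * chord[0]) / chord_length
        max_deviation = deviations.max()

        # Crude curvature proxy: how much longer the path is than the straight-line distance
        curvature = path_length / chord_length

        return [duration, path_length, max_deviation, curvature, pressure.mean()]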


This derived sensory data, along with any usable raw sensory data, is used to generate a data point associated with that particular user interaction. The data point is then added to a user data set, comprising all the data points associated with a given user. If the total number of data points in the set after the new point has been added is fewer than a predetermined number, N, then the process is repeated for further user interactions until the predetermined number of data points has been reached. This data set will form a training set of data for training classifiers for use in authenticating the user identity. It can be stored locally on the user's device and/or in a back end server associated with the provider of the secure service.
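
The enrolment loop just described reduces to buffering derived data points until the predetermined number N is reached; a minimal sketch (with an arbitrary value of N chosen purely for illustration) follows.

    N = 500  # assumed minimum number of enrolment data points before classifiers are trained

    def add_to_training_set(training_set, data_point):
        """Buffer one derived data point; report whether enough points exist to start training."""
        training_set.append(data_point)
        return len(training_set) >= N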


The data point comprises a subset of all the raw and derived sensory data. For example, it may include x and y touch coordinates, pressure, typing speed, touch durations and geolocation. Many other examples are possible.


Once the size of the data set has reached a predefined number of data points, a clustering or classification algorithm (herein referred to as algorithm A) is applied to the user data set. The predefined number of data points can be predetermined to be statistically significant or sufficient to allow the classification algorithm to be able to substantially reliably identify the user, or can be chosen adaptively using machine learning techniques. Algorithm A takes the user data set as an input and produces a model of a set of K clusters or classes corresponding to different user behaviours based on a subset of the user data. The subset of data used is predetermined, but alternatively may again be chosen adaptively using machine learning techniques. The optimal number of classes can be chosen using cross validation.
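
One way to realise algorithm A with the number of classes chosen by cross-validation, sketched here under the assumption that scikit-learn is available and that a Gaussian Mixture Model is used, is to keep the candidate K with the best held-out log-likelihood. This is illustrative only; the method does not mandate any particular library or model-selection criterion.

    import numpy as np
    from sklearn.mixture import GaussianMixture
    from sklearn.model_selection import KFold

    def fit_cluster_model(X, candidate_ks=(2, 3, 4, 5, 6)):
        """Algorithm A sketch: choose K by cross-validated log-likelihood, then fit the final model."""
        best_k, best_score = candidate_ks[0], -np.inf
        for k in candidate_ks:
            fold_scores = []
            for train_idx, test_idx in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
                gmm = GaussianMixture(n_components=k, random_state=0).fit(X[train_idx])
                fold_scores.append(gmm.score(X[test_idx]))      # mean per-sample log-likelihood
            if np.mean(fold_scores) > best_score:
                best_k, best_score = k, np.mean(fold_scores)
        return GaussianMixture(n_components=best_k, random_state=0).fit(X)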


An example of the type of algorithm used as algorithm A is a K-means clustering algorithm. In this example, a set of K random "mean" points is initially generated. Each of the data points in the data set is assigned to the nearest of these mean points, based on a metric (for example the Euclidean distance), to form a set of K clusters. A new mean point for each of these clusters is then calculated. The points in the data set are then reassigned to the nearest of these new means to form K new clusters. These two steps are repeated until convergence is achieved, in the sense that the clusters no longer change, or no longer substantially change, between iterations. Many other examples are possible, such as Gaussian Mixture Models using an Expectation-Maximisation algorithm.
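
The K-means iteration described above (random initial means, assignment by Euclidean distance, recomputation of the means, repeat until the clusters stop changing) can be written out in a few lines of NumPy; the sketch below omits practical refinements such as multiple restarts.

    import numpy as np

    def k_means(X, k, max_iter=100, seed=0):
        rng = np.random.default_rng(seed)
        means = X[rng.choice(len(X), size=k, replace=False)].copy()   # initial random "mean" points
        labels = None
        for _ in range(max_iter):
            # Assign each data point to the nearest mean (Euclidean distance)
            distances = np.linalg.norm(X[:, None, :] - means[None, :, :], axis=2)
            new_labels = distances.argmin(axis=1)
            if labels is not None and np.array_equal(new_labels, labels):
                break                                                  # convergence: clusters unchanged
            labels = new_labels
            # Recompute the mean point of each cluster
            for i in range(k):
                if np.any(labels == i):
                    means[i] = X[labels == i].mean(axis=0)
        return means, labels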


The clusters identified in this way should correspond to different user activity contexts.


Algorithm A also uses the user data set to develop a classifier C, which can be used to identify further data points as associated with one of the K clusters. This classifier may, for example, be based on a K-nearest neighbour classification, details of which will be outlined below.
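
Assuming scikit-learn purely for illustration, the classifier C can be trained on the enrolment points labelled with the cluster indices produced by algorithm A:

    from sklearn.neighbors import KNeighborsClassifier

    def train_cluster_assigner(X, cluster_labels, n_neighbors=5):
        """Classifier C sketch: maps a new data point to one of the K clusters."""
        return KNeighborsClassifier(n_neighbors=n_neighbors).fit(X, cluster_labels)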


Once the data clusters corresponding to different user environments or behaviours have been determined by algorithm A, a second classification algorithm (herein referred to as algorithm B) is applied to the data points within each cluster Ki. Algorithm B trains a separate classifier Ci for each of the K clusters, which distinguishes between a legitimate user and an illegitimate one (an “attacker”). Algorithm B is based on a Random Forest decision-learning tree, but other examples are possible.
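
A hedged sketch of algorithm B follows, assuming scikit-learn's RandomForestClassifier and assuming that negative ("attacker") examples are available, for instance drawn from other users' data; how the negative class is obtained is an implementation detail not specified here.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def train_cluster_classifiers(X_user, cluster_labels, X_attacker):
        """Algorithm B sketch: one authentication classifier Ci per cluster Ki."""
        classifiers = {}
        for i in np.unique(cluster_labels):
            X_pos = X_user[cluster_labels == i]                 # legitimate user's points in cluster i
            X = np.vstack([X_pos, X_attacker])
            y = [1] * len(X_pos) + [0] * len(X_attacker)        # 1 = legitimate user, 0 = attacker
            classifiers[int(i)] = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
        return classifiers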


It should be noted that algorithm B could be a set of algorithms, one for each of the K clusters.


The output of the enrolment stage is therefore multiple classifiers, or a classifier and a classifier set: a first classifier C for assigning new data points to one of the K clusters identified by algorithm A; and a set of further classifiers {Ci}, each of which is associated with one of the data clusters and is operable to provide authentication of a new data point.


The first classifier and the second classifier set, along with the derived sensory data used to train them, may be stored on the user's computing device, or alternatively on a remote server or group of servers (e.g. a cloud service or cloud) that could, for example, be under the control of the secure service provider. Likewise, the identification of the K clusters and the training of the classifiers can take place either on the user's computing device, or remotely (e.g. within the cloud).


Referring now to FIG. 2, the authentication stage of the method will now be described according to an embodiment of the method.


In the authentication stage of the method, a user interacts with the computing device, generating a data point from sensory data, which is used to authenticate the user's identity.


The authentication stage begins with the user interacting with the computing device while entering authentication data, such as a password or PIN. As the authentication data is entered, raw sensory data, as described above, is generated by sensors associated with the computing device. A pre-processor (or processor) then converts this raw sensory data to derived sensory data, as described above, and generates a data point from it. In this regard, the first steps of the authentication stage are substantially identical to those of the enrolment stage.


Upon generation of the data point, the classifier C (generated by algorithm A in the enrolment stage) is applied to it. This determines which of the K clusters, identified by algorithm A, the data point should be associated with, i.e. which user situation and/or behaviour is the most appropriate to use given the sensory inputs. This can be achieved, for example, by the use of a k-nearest neighbour algorithm, which works by determining the classes of the k nearest points in the data set (where this k is not to be confused with the number of clusters K), based on a distance metric (for example the Euclidean distance), and assigning the data point to the cluster containing the largest number of those nearest points. It will be appreciated that alternatives to a k-nearest neighbour approach may be used.


Once the first classifier has identified the appropriate cluster, Ki, the corresponding second classifier, Ci, developed by algorithm B in the enrolment stage and associated with the identified cluster, is applied to the data point. This second stage classifier is based on a random forest algorithm, in which the data point is passed through the ensemble of decision trees trained during the enrolment stage. The output of each of these trees is either a pass or a fail. The ratio of the number of passes to the number of fails is used to determine a confidence score that the user identity is correct. The classifier may alternatively be based on a number of other algorithms, including, but not limited to: neural networks; k-nearest neighbours; and naïve Bayes.
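
The pass/fail tally over the ensemble can be made explicit as below; this assumes a scikit-learn style forest whose individual trees are exposed via estimators_, and is only a sketch of the ratio described in the text.

    import numpy as np

    def confidence_score(forest, data_point):
        """Fraction of decision trees that classify the data point as the legitimate user (class 1)."""
        point = np.asarray(data_point, dtype=float).reshape(1, -1)
        votes = [forest.classes_[int(tree.predict(point)[0])] for tree in forest.estimators_]
        passes = sum(1 for vote in votes if vote == 1)
        return passes / len(votes)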


The output of the second classifier is a confidence score for the user being an approved user. If this score is above a pre-determined threshold, the user is authenticated as a legitimate, authorised user. The threshold is variable depending on the requirements of the service being accessed by the user, and can be set to prioritise minimising false-positives or false negatives. It adjusts automatically based on the data used to train the classifiers in order to achieve the desired error rates.
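
Setting the threshold to achieve a desired error rate can be sketched, under the assumption that held-out confidence scores for attacker attempts are available from the training data, by taking the lowest threshold whose empirical false-positive rate meets the target; the target rate used here is arbitrary, and false negatives could be balanced against it in the same way.

    import numpy as np

    def calibrate_threshold(attacker_scores, max_false_positive_rate=0.01):
        """Pick the lowest threshold at which the fraction of accepted attacker attempts meets the target."""
        for threshold in np.sort(np.unique(attacker_scores)):
            false_positive_rate = np.mean(np.asarray(attacker_scores) >= threshold)
            if false_positive_rate <= max_false_positive_rate:
                return float(threshold)
        return 1.0  # no attacker score is low enough: require a perfect confidence score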


The confidence score is output along with the pass/fail authentication result.


If the confidence score does not exceed the required threshold, the user may be asked to input further security or authentication data, for example by answering security questions or inputting passwords or PINs.


In an alternative embodiment, the user's interaction with the device is used as described above to authenticate the user identity without the user being required to input any security data. The user's sensory data is collected and monitored in the background on the device. When the user would normally be required to log in to a service to perform authentication, the user is not required to provide any password, PIN or fingerprint as long as the behavioural biometrics continue to classify the user as the authorised user (for example during an online checkout process when purchasing goods or services over the Internet), since the authentication is performed in the background. Should the behavioural biometrics fail to continue to classify the user as the authorised user, the user will be asked to enter further security information.


A user may interact with a secure service through multiple different devices or types of device. When this is the case, the total data set associated with the user is partitioned into subsets based on the device identity before algorithm A is applied. Enrolment is essentially performed for each device type individually.
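
Partitioning the collected data by device identity before applying algorithm A is straightforward; the sketch below assumes each stored data point is tagged with a device identifier.

    from collections import defaultdict

    def partition_by_device(tagged_points):
        """tagged_points: iterable of (device_id, data_point) pairs collected for one user."""
        per_device = defaultdict(list)
        for device_id, data_point in tagged_points:
            per_device[device_id].append(data_point)
        # Enrolment (algorithm A, then algorithm B) is then run on each subset independently
        return dict(per_device)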


The computations associated with the enrolment and authentication stages (i.e. the application of algorithms A and B in enrolment and of the classifiers in authentication) can be performed either on the user computing device itself, or on a server or servers associated with the provider of the secure service, or in combination between the device and server or servers.


There are four examples presented here:


In the first example, all computation is performed on the server. The raw and/or derived sensory data from the computing device's sensors is transmitted to the server across a network, where the algorithms or classifiers are applied to it. The classifiers trained in the enrolment stage are stored on the server.


In the second example, all the computations are performed on the user device. The classifiers are stored locally on the user computing device. An authentication message is sent to the server upon the computing device authenticating the user identity, allowing the user access to the secure service.


The third example splits the computation between the server and the user computing device. The enrolment stage computations are performed on the server to train the classifiers. These are then transmitted to the user computing device and stored locally. The user computing device applies the classifiers to the user data point being used for authentication and transmits a message to the server indicating success or failure. This combination is advantageous in cases where the user computing device has limited processing power, or where the data set is very large.


The fourth example performs the enrolment stage calculations on the user computing device and then transmits the trained classifiers to the server. The authentication stage calculations are then performed by the server when it receives sensory data (or a data point) transmitted to it across a network by the computing device.


If the user is authenticated, then the newly authenticated data point can be added to the set of data points for use in a future enrolment stage update. Each time a pre-defined number, M, of new data points has been added to the data set, the model generated by the enrolment stage is updated. This can be done incrementally, using the current models as a starting point, or the models can be regenerated completely.
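
The periodic update can be sketched as a simple counter over newly authenticated data points; the value of M, and whether the existing model is warm-started or rebuilt from scratch, are deployment choices, and the retraining callback named here is hypothetical.

    M = 50  # assumed number of newly authenticated data points between profile updates

    def on_successful_authentication(training_set, data_point, new_points_since_update, retrain):
        """Add the authenticated point to the training set and trigger retraining every M new points."""
        training_set.append(data_point)
        new_points_since_update += 1
        if new_points_since_update >= M:
            retrain(training_set)          # hypothetical callback that reruns algorithms A and B
            new_points_since_update = 0
        return new_points_since_update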


Any system feature as described herein may also be provided as a method feature, and vice versa. As used herein, means plus function features may be expressed alternatively in terms of their corresponding structure.


Any feature in one aspect of the invention may be applied to other aspects of the invention, in any appropriate combination. In particular, method aspects may be applied to system aspects, and vice versa. Furthermore, any, some and/or all features in one aspect can be applied to any, some and/or all features in any other aspect, in any appropriate combination.


It should also be appreciated that particular combinations of the various features described and defined in any aspects of the invention can be implemented and/or supplied and/or used independently.

Claims
  • 1. A method comprising: (a) monitoring touch-screen gestures of a user, that interacts with an online service via a touch-screen of a computing device; and monitoring acceleration and tilt of said computing device, during said touch-screen gestures; (b) extracting data-points from information monitored in step (a); (c) generating a user-specific profile that indicates a characterizing acceleration and a characterizing tilt of the computing device during said touch-screen gestures; (d) subsequently, monitoring touch-screen gestures during access to said online service, and monitoring accompanying acceleration and tilt; (e) extracting data-points from information monitored in step (d); (f) analyzing the data-points extracted in step (e), and checking whether said data-points match said user-specific profile generated in step (c); (g) if the checking of step (f) has a negative result, then: generating a notification that a user that interacted in step (d) to access said online service, is different from the user that interacted in step (a) to access said online service; wherein the generating of step (g) is further based on: determining that a first characteristic of landscape/portrait device orientation during touch-screen gestures monitored on the computing device in step (a), is different from a second characteristic of landscape/portrait device orientation during touch-screen gestures monitored on the computing device in step (d).
  • 2. The method of claim 1, wherein the computing device is a device selected from the group consisting of: a laptop computer, a personal computer, a desktop computer, a mobile computing device, a mobile telephone, a tablet.
  • 3. The method of claim 1, comprising: utilizing user interactions with the computing device to authenticate the user identity without requiring the user to input any security data, by collecting and monitoring sensory data in the background on the computing device.
  • 4. The method of claim 1, comprising: utilizing user interactions with the computing device to authenticate the user identity without requiring the user to input any security data, by collecting and monitoring sensory data in the background on the computing device; wherein, if behavioral biometrics continue to classify the user as an authorized user, then the method does not require the user to provide any password or PIN or fingerprint and the method comprises performing user authentication in the background; wherein, if said behavioral biometrics fail to continue to classify the user as the authorized user, then the method comprises asking the user to enter further security information.
  • 5. The method of claim 1, wherein the extracting of data-points, in at least one of step (c) and step (e), comprises: deriving features from sensory data that includes one or more of: duration of touch-screen interaction, physical distance of touch-screen interaction, time between touch-screen interactions, deviation from a straight line during touch-screen interaction, acceleration of touch-screen interaction, deceleration of touch-screen interaction, curvature of touch-screen interaction, length of touch-screen interaction.
  • 6. The method of claim 1, wherein the extracting of data-points, in at least one of step (c) and step (e), comprises: deriving features from sensory data that includes one or more of: duration of touch-screen interaction, physical distance of touch-screen interaction, time between touch-screen interactions, deviation from a straight line during touch-screen interaction, acceleration of touch-screen interaction, deceleration of touch-screen interaction, curvature of touch-screen interaction, length of touch-screen interaction.
  • 7. The method of claim 1, wherein the extracting of data-points, in at least one of step (c) and step (e), is performed until data-points reach a pre-defined number of data-points that is determined to be statistically significant for performing classification that reliably identifies the user.
  • 8. The method of claim 1, wherein the extracting of data-points, in at least one of step (c) and step (e), is performed until data-points reach a number of data-points that is chosen adaptively by a machine learning technique.
  • 9. The method of claim 1, wherein monitoring touch-screen gestures and monitoring acceleration and tilt, in at least one of step (a) and step (d), are performed while the user is entering user-authentication data which includes at least one of: a password, a Personal Identification Number (PIN).
  • 10. The method of claim 1, wherein monitoring touch-screen gestures and monitoring acceleration and tilt, in at least one of step (a) and step (d), are performed during an online check-out process of purchasing goods or services over the Internet.
  • 11. The method of claim 1, wherein generating the user-specific profile in step (c) is performed locally by a processor of said computing device.
  • 12. The method of claim 1, wherein generating the user-specific profile in step (c) is performed on a remote server which receives from the computing device at least one of: raw sensory data, derived sensory data.
  • 13. A method comprising: (a) monitoring touch-screen gestures of a user, that interacts with an online service via a touch-screen of a computing device; and monitoring acceleration and tilt of said computing device, during said touch-screen gestures; (b) extracting data-points from information monitored in step (a); (c) generating a user-specific profile that indicates a characterizing acceleration and a characterizing tilt of the computing device during said touch-screen gestures; (d) subsequently, monitoring touch-screen gestures during access to said online service, and monitoring accompanying acceleration and tilt; (e) extracting data-points from information monitored in step (d); (f) analyzing the data-points extracted in step (e), and checking whether said data-points match said user-specific profile generated in step (c); (g) if the checking of step (f) has a negative result, then: generating a notification that a user that interacted in step (d) to access said online service, is different from the user that interacted in step (a) to access said online service; wherein the generating of step (g) comprises: generating a user-specific profile that is based on at least: (I) a characteristic of background tremor of the computing device during monitored touch-screen gestures, and (II) a characteristic of curvature of monitored touch-screen gestures, and (III) a characteristic of landscape/portrait device orientation during monitored touch-screen gestures.
  • 14. The method of claim 13, wherein the computing device is a device selected from the group consisting of: a laptop computer, a personal computer, a desktop computer, a mobile computing device, a mobile telephone, a tablet.
  • 15. The method of claim 13, comprising: utilizing user interactions with the computing device to authenticate the user identity without requiring the user to input any security data, by collecting and monitoring sensory data in the background on the computing device.
  • 16. The method of claim 13, comprising: utilizing user interactions with the computing device to authenticate the user identity without requiring the user to input any security data, by collecting and monitoring sensory data in the background on the computing device; wherein, if behavioral biometrics continue to classify the user as an authorized user, then the method does not require the user to provide any password or PIN or fingerprint and the method comprises performing user authentication in the background; wherein, if said behavioral biometrics fail to continue to classify the user as the authorized user, then the method comprises asking the user to enter further security information.
  • 17. A method comprising: (a) monitoring touch-screen gestures of a user, that interacts with an online service via a touch-screen of a computing device; and monitoring acceleration and tilt of said computing device, during said touch-screen gestures; (b) generating a user-specific profile that is based on at least: (i) device tremor during user interactions, and (ii) device orientation as portrait or landscape during user interactions; (c) subsequently, monitoring touch-screen gestures and monitoring acceleration and tilt; and checking whether touch-screen gestures and acceleration and tilt match the user-specific profile generated in step (b); (d) if the checking of step (c) has a negative result, then: generating a notification that a user that interacted in step (c) to access said online service, is different from the user that interacted in step (a) to access said online service.
  • 18. The method of claim 17, wherein the computing device is a device selected from the group consisting of: a laptop computer, a personal computer, a desktop computer, a mobile computing device, a mobile telephone, a tablet.
  • 19. The method of claim 17, comprising: utilizing user interactions with the computing device to authenticate the user identity without requiring the user to input any security data, by collecting and monitoring sensory data in the background on the computing device.
  • 20. The method of claim 17, comprising: utilizing user interactions with the computing device to authenticate the user identity without requiring the user to input any security data, by collecting and monitoring sensory data in the background on the computing device; wherein, if behavioral biometrics continue to classify the user as an authorized user, then the method does not require the user to provide any password or PIN or fingerprint and the method comprises performing user authentication in the background; wherein, if said behavioral biometrics fail to continue to classify the user as the authorized user, then the method comprises asking the user to enter further security information.
  • 21. A system comprising: a hardware processor which is configured to
  • 22. The system of claim 21, wherein the system is configured to utilize user interactions with the computing device to authenticate the user identity without requiring the user to input any security data, by collecting and monitoring sensory data in the background on the computing device.
  • 23. A system comprising: a hardware processor that is adapted to
  • 24. The system of claim 23, wherein the system is configured to utilize user interactions with the computing device to authenticate the user identity without requiring the user to input any security data, by collecting and monitoring sensory data in the background on the computing device.
  • 25. A system comprising: a hardware processor that is configured to
  • 26. The system of claim 25, wherein the system is configured to utilize user interactions with the computing device to authenticate the user identity without requiring the user to input any security data, by collecting and monitoring sensory data in the background on the computing device.
Priority Claims (1)
Number Date Country Kind
1511230 Jun 2015 GB national
CROSS-REFERENCE TO RELATED APPLICATIONS

This patent application is a Continuation of U.S. Ser. No. 15/192,845, filed on Jun. 24, 2016, which is hereby incorporated by reference in its entirety; which claims priority from Great Britain Patent Application GB 1511230.3, filed on Jun. 25, 2015, which is hereby incorporated by reference in its entirety.

Related Publications (1)
Number Date Country
20200327422 A1 Oct 2020 US
Continuations (1)
Number Date Country
Parent 15192845 Jun 2016 US
Child 16902289 US