Method, system, and device of authenticating identity of a user of an electronic device

Information

  • Patent Grant
  • Patent Number
    10,747,305
  • Date Filed
    Sunday, May 19, 2019
  • Date Issued
    Tuesday, August 18, 2020
Abstract
A method for confirming identity of a user of a mobile electronic device, the method including: receiving touch data from a touch-screen of the mobile electronic device; receiving acceleration data from an accelerometer of the mobile electronic device; correlating between the touch data and the acceleration data; based on the correlating, generating a user-specific trait indicative of said user. The method further includes storing a reference value of the user-specific trait, indicative of said user; in a subsequent usage session of the mobile electronic device, generating a current value of the user-specific trait correlating between touch data and acceleration data; and based on a comparison between the current value of the user-specific trait and the reference value of the user-specific trait, determining whether or not a current user of the mobile electronic device is an authorized user of the mobile electronic device.
Description
FIELD

The present invention is related to the security of electronic devices and systems.


BACKGROUND

Millions of people around the world utilize mobile electronic devices, such as smartphones and tablets, in order to perform various activities. Such activities may include, for example, browsing the Internet, sending and receiving electronic mail (email) messages, taking photographs and videos, engaging in a video conference or a chat session, playing games, or the like.


Some activities may be privileged, or may require authentication of the user in order to ensure that only an authorized user engages in the activity. For example, a user may be required to enter a username and a password in order to access an email account, or in order to access an online banking interface or website.


SUMMARY

The present invention may include, for example, systems, devices, and methods for detecting identity of a user of a mobile electronic device, and for determining that a mobile electronic device is used by a fraudulent user.


In accordance with the present invention, for example, a method for confirming identity of a user of a mobile electronic device may comprise: receiving touch data from a touch-screen of the mobile electronic device; receiving acceleration data from an accelerometer of the mobile electronic device; correlating between the touch data and the acceleration data; based on the correlating, generating a user-specific trait indicative of said user.


In accordance with the present invention, for example, the method may comprise: storing a reference value of the user-specific trait, indicative of said user; in a subsequent usage session of the mobile electronic device, generating a current value of the user-specific trait correlating between touch data and acceleration data; and based on a comparison between the current value of the user-specific trait and the reference value of the user-specific trait, determining whether or not a current user of the mobile electronic device is an authorized user of the mobile electronic device.


In accordance with the present invention, for example, storing comprises: storing within said mobile electronic device; and said comparison is performed within said mobile electronic device.


In accordance with the present invention, for example, storing comprises storing externally to said mobile electronic device; and said comparison is performed externally to said mobile electronic device, and comprises wirelessly receiving at the mobile electronic device an indication of said comparison.


In accordance with the present invention, for example, said touch data comprises non-tactile touch data indicating a hovering user gesture in proximity to said touch-screen.


In accordance with the present invention, for example, the method may comprise: receiving gyroscope data from a gyroscope of the mobile electronic device; correlating between the touch data and the gyroscope data; based on the correlating between the touch data and the gyroscope data, generating another user-specific trait indicative of said user.


In accordance with the present invention, for example, the method may comprise: capturing non-tactile motion data indicating a user gesture; correlating between the non-tactile motion data and the acceleration data; based on the correlating between the non-tactile motion data and the acceleration data, generating another user-specific trait indicative of said user.


In accordance with the present invention, for example, the method may comprise: comparing between (a) a currently-calculated value of the user-specific trait, corresponding to a current usage of the mobile electronic device, and (b) a previously-calculated value of the user-specific trait, corresponding to a previous usage of the mobile electronic device; and based on a comparison result, performing at least one of: restricting access of said user to an online service; restricting access of said user to an application installed on said mobile electronic device; requiring the user to authenticate his identity to an online service; requiring the user to authenticate his identity to an application installed on said mobile electronic device.


In accordance with the present invention, for example, the method may comprise: based on said touch data, estimating user-specific motor control parameters and user-specific motor control noise; and based on the estimated user-specific motor control parameters and user-specific motor control noise, differentiating between said user and another user.


In accordance with the present invention, for example, the method may comprise: based on said touch data, estimating user-specific motor control parameters and user-specific motor control noise of a control loop which comprises translation error and gesture velocity error; and based on the estimated user-specific motor control parameters and user-specific motor control noise, differentiating between said user and another user.


In accordance with the present invention, for example, the method may comprise: based on said correlating, estimating a user-specific physiological trait of said user; and based on the user-specific physiological trait, differentiating between said user and another user.


In accordance with the present invention, for example, estimating the user-specific physiological trait of said user comprises at least one of: estimating a length of a finger of the user; estimating a width of a finger of the user; estimating a size-related parameter of a finger of the user; estimating a distance between a tip of a finger of the user and another part of a hand of the user.


In accordance with the present invention, for example, the method may comprise: based on said correlating, estimating a user-specific behavioral trait of said user; and based on the user-specific behavioral trait, differentiating between said user and another user.


In accordance with the present invention, for example, estimating the user-specific behavioral trait of said user comprises: determining that said user typically performs a particular inadvertent gesture while performing a user-intended input-providing gesture.


In accordance with the present invention, for example, estimating the user-specific behavioral trait of said user comprises one or more of: determining that said user typically moves the mobile electronic device at a particular direction while performing a touch gesture; determining that said user typically rotates the mobile electronic device while performing a touch gesture; determining that said user typically slants the mobile electronic device at a particular angle while performing a touch gesture.


In accordance with the present invention, for example, estimating the user-specific behavioral trait of said user comprises: determining that said user typically holds the mobile electronic device with a first hand of the user and concurrently performs an input-providing gesture with a second hand of the user.


In accordance with the present invention, for example, estimating the user-specific behavioral trait of said user comprises: determining that said user typically holds the mobile electronic device with a single hand and concurrently performs an input-providing gesture with said single hand.


In accordance with the present invention, for example, the method may comprise: based on said correlating, estimating a first user-specific behavioral trait of said user which corresponds to a first usage scenario; based on said correlating, estimating a second user-specific behavioral trait of said user which corresponds to a second usage scenario; based on the first and second user-specific behavioral traits, differentiating between said user and another user.


In accordance with the present invention, for example, the method may comprise: based on said correlating, estimating a first user-specific behavioral trait of said user which corresponds to a first usage scenario in which said user operates said mobile electronic device while the user holds said mobile electronic device; based on said correlating, estimating a second user-specific behavioral trait of said user which corresponds to a second usage scenario in which said user operates said mobile electronic device while the user does not hold said mobile electronic device; based on the first and second user-specific behavioral traits, differentiating between said user and another user.


In accordance with the present invention, for example, a mobile electronic device may be configured to confirm identity of a user of said mobile electronic device; the mobile electronic device comprising: a touch-screen to receive touch data; an accelerometer to receive acceleration data; a correlator module to correlate between the touch data and the acceleration data; a trait extractor module to generate a user-specific trait indicative of said user, based on correlation between the touch data and the acceleration data.


The present invention may provide other and/or additional benefits or advantages.





BRIEF DESCRIPTION OF THE DRAWINGS

For simplicity and clarity of illustration, elements shown in the figures have not necessarily been drawn to scale. For example, the dimensions of some of the elements may be exaggerated relative to other elements for clarity of presentation. Furthermore, reference numerals may be repeated among the figures to indicate corresponding or analogous elements. The figures are listed below.



FIG. 1 is a schematic block diagram illustration of a mobile electronic device, in accordance with some demonstrative embodiments of the present invention;



FIG. 2 is an illustration of three graphs, which demonstrate acceleration as a function of time over three separate axes, in accordance with some demonstrative embodiments of the present invention;



FIG. 3 is an illustration of a graph of the main axes of dimension-reduced space of accelerometer reaction to tapping, in accordance with some demonstrative embodiments of the present invention;



FIG. 4 is an illustration of a graph depicting feature space, in accordance with some demonstrative embodiments of the present invention; and



FIG. 5 is a flow-chart of a method in accordance with some demonstrative embodiments of the present invention.





DETAILED DESCRIPTION OF THE PRESENT INVENTION

In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of some embodiments. However, it will be understood by persons of ordinary skill in the art that some embodiments may be practiced without these specific details. In other instances, well-known methods, procedures, components, units and/or circuits have not been described in detail so as not to obscure the discussion.


Applicants have realized that each user of a mobile electronic device may handle the device in a unique manner which may be detected and may be utilized for confirming the identity of the user, or for other security-related purposes or fraud-detection purposes. Applicants have realized, for example, that different users cause different types of acceleration to the mobile device when they perform the same operation or touch-gesture (e.g., swiping or tapping or scrolling on the touch-screen), or may tilt or rotate or slant the mobile device in different, unique ways when they perform such gestures or operations.


The present invention may include, for example, biometric modalities, personal trait extraction modalities, and/or identity authentication modalities which may be used in conjunction with a mobile or portable electronic device, and may utilize a combination of (or correlation between) acceleration parameters and/or touch data. Such parameters may be used in order to deduce unique insights regarding the identity or possible identity of the user of the mobile electronic device, or in order to determine whether or not the user is considered to be the “genuine” user, or in contrast, an attacker or impersonator or “fraudster”.


The present invention may capture, monitor, or otherwise utilize for deduction of insights, the coupling or correlation between (a) touch-screen interaction, or other user gestures, and (b) accelerometer(s) measurements and/or gyroscope(s) measurements. The present invention may further deduce and/or utilize one or more other biometric traits or identity-authentication traits, for example, touch or swipe locations, pressure dynamics, identification of physiological regions (e.g., in the hand of the user) that move while other regions do not move when a user gesture is performed, or other suitable traits in order to assist in identification and/or authentication of the user of the mobile device.


The present invention may sufficiently capture unique qualities of a human user to be usable as a biometric for authentication. Different people may have different preferred orientations for holding or grasping (e.g., in their hand) a mobile device, and/or a different way in which they press or touch or tap the touch-screen (e.g., the applied force, the duration of the tapping, or the like).


Applicants have realized that physical traits such as, for example, hand size, hand mass, or other traits, may change the way in which a user's interacting hand and his device-holding hand are correlated. In a demonstrative example, the present invention may distinguish or differentiate between (a) a person who is using one single hand for both holding the mobile device and tapping on its touch-screen (or performing other touch gesture), and (b) a person who is using one hand to hold the mobile device and another hand to tap on its touch-screen (or to perform other touch gesture or user gesture).


Moreover, as Applicants have realized, different tap locations (e.g., top-left corner or region of the touch-screen, versus bottom-right corner or region) may create different torque(s) on the mobile device, further depending on the tap strength, the offset of the mobile device in the hand (e.g., the device being held high or low, with the palm area or the fingers area, or the like) and/or the size of the hand (e.g., if the same hand is used for both holding the device and tapping on its touch-screen).


The terms “mobile device” or “mobile electronic device” as used herein may include, for example, a smartphone, a cellular phone, a mobile phone, a tablet, a handheld device, a portable electronic device, a portable gaming device, a portable audio/video player, or the like.


The term “genuine user” as used herein may include, for example, an owner of a mobile electronic device; a legal or lawful user of a mobile electronic device; an authorized user of a mobile electronic device; a person who has legal authorization and/or legal right to utilize a mobile electronic device, for general purpose(s) and/or for one or more particular purpose(s); or the person who had originally defined user credentials (e.g., username and password) for performing an activity through the mobile electronic device.


The term “fraudulent user” as used herein may include, for example, any person who is not the “genuine user” of the mobile electronic device; an attacker; an intruder; a man-in-the-middle attacker; a man-in-the-browser attacker; an unauthorized user; an impersonator; a hacker; a cracker; a person attempting to hack or crack or compromise a security measure utilized by the mobile electronic device or utilized by an activity or service accessible through the mobile electronic device; a fraudster; a human fraudster; a “bot” or a malware or an automated computerized process (e.g., implemented by using software modules and/or hardware components) which attempts to imitate human behavior or which attempts to act as if such “bot” or malware or process was the genuine user; or the like.


The term “user gesture” as used herein may include, for example, a gesture or movement or other operation that a user of a mobile device performs on a touch-screen of the mobile device, or performs in proximity to the touch-screen of the mobile device; touch gesture; tap gesture or double-tap gesture or prolonged tap gesture; scroll gesture; drag gesture, or drag-and-drop gesture; release gesture; click or double-click gesture; hovering gestures, in which the user may hover with his finger(s) or hand(s) in proximity to the touch-screen of the mobile device but without necessarily touching the touch-screen device; hovering gestures that may be captured by a camera of the mobile device, or by a touch-screen of the mobile device (e.g., by taking into account electrical and/or magnetic effects of such gestures); hovering gestures which may be generally similar to touch-free hovering gestures that a Samsung Galaxy S4 smartphone is able to detect; finger(s) gestures and/or hand(s) gestures made in a three-dimensional space, for example, similar to movement gestures that a Microsoft Kinect motion sensing input device is able to sense; and/or a combination of such gestures or other gestures.


Reference is made to FIG. 1, which is a schematic block diagram illustration of a mobile device 100 in accordance with the present invention. Mobile device 100 may comprise, for example, a processor 101, a memory unit 102, a storage unit 103, a wireless transceiver 104, a touch-screen 105, one or more accelerometers 106, and one or more gyroscopes 107. Mobile device 100 may further comprise, for example, one or more hovering sensors 108, one or more motion gesture sensor(s) 109, a correlator 131, a trait extractor 132, a trait repository 133, a profile constructor module 134, an identity authenticator module 135, and a physiological trait estimator 139. Mobile device 100 may comprise other suitable hardware components and/or software modules, for example, a power source (e.g., a rechargeable battery), an Operating System, software applications, or the like.


Touch-screen 105 may receive user gestures (for example, tapping, double-tapping, dragging, pressing, holding down, releasing, scrolling, pinching fingers for zoom-out, spreading fingers for zoom-in, or the like). Touch data may be stored in a touch data repository 125, optionally in association with a time-stamp associated with each touch data-item being stored.


Accelerometer(s) 106 may comprise, for example, a three-axis accelerometer able to measure acceleration, separately, along three axes (X axis, Y axis, Z axis). Accelerometer readings may be stored in an acceleration data repository 126, optionally in association with a time-stamp associated with each acceleration data-item being stored.


Gyroscope(s) 107 may comprise, for example, a three-axis gyroscope able to measure orientation and/or rotation, e.g., separately along three axes (X axis, Y axis, Z axis). The measured data may be stored in a gyroscope data repository 127, optionally in association with a time-stamp associated with each orientation/rotation data-item being stored.


Hovering sensor(s) 108 may comprise, for example, one or more sensors (e.g., optical sensors, magnetic sensors, electric sensors, touch-screen components, camera components, or the like) able to sense hovering gesture(s) of the user of device 100, for example, in a three-dimensional space or separately along three axes (X axis, Y axis, Z axis). The measured data may be stored in a hovering data repository 128, optionally in association with a time-stamp associated with each hovering data-item being stored.


Motion gesture sensor(s) 109 may comprise, for example, one or more sensors able to sense motion gesture(s) of the user of device 100, for example, in a three-dimensional space or separately along three axes (X axis, Y axis, Z axis). The measured data may be stored in a motion gesture data repository 129, optionally in association with a time-stamp associated with each motion gesture data-item being stored.
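
For demonstrative purposes only, the following Python sketch illustrates one possible form for the time-stamped repositories described above (touch data, acceleration data, gyroscope data, hovering data, motion gesture data); the class names and fields are illustrative assumptions and are not taken from the claimed implementation.

```python
# A minimal sketch of time-stamped sensor repositories; names are illustrative.
import time
from dataclasses import dataclass, field

@dataclass
class TimestampedSample:
    timestamp: float      # seconds since epoch
    values: tuple         # e.g. (x, y) for touch, (ax, ay, az) for acceleration

@dataclass
class SensorRepository:
    samples: list = field(default_factory=list)

    def store(self, values):
        # Store the data-item together with its time-stamp.
        self.samples.append(TimestampedSample(time.time(), tuple(values)))

    def between(self, t_start, t_end):
        # Retrieve all samples whose time-stamps fall inside a time window.
        return [s for s in self.samples if t_start <= s.timestamp <= t_end]

touch_repo = SensorRepository()
accel_repo = SensorRepository()
touch_repo.store((120.0, 355.5))          # one touch point
accel_repo.store((0.02, -0.15, 9.81))     # one accelerometer reading
```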


Correlator 131 may search for, or identify or determine, correlation among (a) acceleration data and/or gyroscope data, and (b) touch data and/or hovering data and/or motion gesture data. Trait extractor 132 may determine one or more user-specific traits or characteristics which may be, or may appear to be, unique to (or indicative of) a particular user, based on one or more correlation(s) identified by correlator 131. Trait values or trait indicators, or data indicative of extracted user-specific traits, may be stored in a trait repository 133.
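
The operation of correlator 131 and trait extractor 132 may be illustrated by a minimal, hedged sketch: for each time-stamped touch event, the accelerometer samples in a short window after the touch are summarized into a transient-amplitude feature, and the statistics of that feature serve as a candidate user-specific trait. The window length, the feature choice, and the function names are assumptions made for illustration only.

```python
# Illustrative correlation of touch time-stamps with accelerometer transients.
import numpy as np

def extract_trait(touch_times, accel_times, accel_z, window_s=0.5):
    """For each touch time-stamp, summarize the Z-axis accelerometer response
    in a short window after the touch (peak-to-peak transient amplitude)."""
    peaks = []
    for t in touch_times:
        mask = (accel_times >= t) & (accel_times <= t + window_s)
        if mask.any():
            segment = accel_z[mask]
            peaks.append(segment.max() - segment.min())  # transient amplitude
    # Candidate user-specific trait: mean and spread of tap-induced transients.
    return np.mean(peaks), np.std(peaks)

# Example with synthetic data: 100 Hz accelerometer, three taps.
accel_times = np.arange(0, 10, 0.01)
accel_z = 9.8 + 0.01 * np.random.randn(len(accel_times))
for t in (2.0, 5.0, 8.0):                     # simulate tap transients
    idx = (accel_times >= t) & (accel_times <= t + 0.1)
    accel_z[idx] += 0.5 * np.exp(-50 * (accel_times[idx] - t))
print(extract_trait(np.array([2.0, 5.0, 8.0]), accel_times, accel_z))
```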


Profile constructor module 134 may utilize a learning algorithm to construct a user profile based on the one or more user-specific traits identified by trait extractor 132 and stored in trait repository 133. Profile construction may be performed over a pre-defined time period (e.g., five hours, or three days) of the user interacting with device 100; or over a pre-defined number of interactions (e.g., 12 or 25 or 100 interactions) of the user with device 100. Optionally, profile constructor module 134 may dynamically extend or shorten or modify the required time-period or interaction number, for example, if traits of a particular user are distinctive and are rapidly extracted over a shorter period of time or over a smaller number of user interactions. Constructed user profiles may be stored in a user profile repository, which may be internal to device 100 or may be external thereto (e.g., in a remote server or in a “cloud computing” server), optionally with an associated flag or parameter indicating whether a particular user profile is fully constructed or under construction.
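
A minimal sketch of such a profile constructor is shown below, assuming a pre-defined maximum number of interactions that may be dynamically shortened when the observed traits are already stable (low spread); the thresholds and class names are illustrative assumptions.

```python
# Hedged sketch of a profile constructor that stops enrollment early when the
# observed traits are already distinctive and stable.
import numpy as np

class ProfileConstructor:
    def __init__(self, max_interactions=100, min_interactions=12, cv_threshold=0.05):
        self.max_interactions = max_interactions
        self.min_interactions = min_interactions
        self.cv_threshold = cv_threshold      # coefficient-of-variation cutoff
        self.samples = []
        self.complete = False                 # the "fully constructed" flag

    def add_trait_sample(self, trait_vector):
        self.samples.append(np.asarray(trait_vector, dtype=float))
        n = len(self.samples)
        if n >= self.min_interactions:
            data = np.vstack(self.samples)
            cv = np.std(data, axis=0) / (np.abs(np.mean(data, axis=0)) + 1e-9)
            # Shorten the enrollment period if the traits are already stable,
            # or stop once the pre-defined number of interactions is reached.
            if cv.max() < self.cv_threshold or n >= self.max_interactions:
                self.complete = True
        return self.complete

    def reference_profile(self):
        return np.mean(np.vstack(self.samples), axis=0) if self.samples else None

# Usage: feed one trait vector per user interaction until the profile is complete.
```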


Identity authenticator module 135 may capture one or more traits of a user who is currently utilizing device 100, and may analyze and determine whether or not these traits are similar to, or different from, user-specific traits in a user profile associated with a user that is believed to be a “genuine” user of device 100. The analysis results may be notified by identity authenticator module 135 to other units or modules, within device 100 (e.g., an application or process running in device 100) and/or externally to device 100 (e.g., on a remote server, on a remote web-site or web-page, in a “cloud” server or device).


For example, if the analysis indicates that the current user of device 100 is not the genuine user, then, one or more fraud-stopping operations or additional authentication operations may be triggered and performed, for example, requiring the user to re-enter his password or pass-phrase or Personal Identification Number (PIN), requiring the user to answer one or more security questions, requiring the user to perform log-in operations or to provide account details (e.g., to provide date-of-birth data), requiring the user to place a phone call to a fraud department or a security department of a service or entity associated with an application running on device 100; blocking or restricting or curtailing access of the user to one or more services or features which may be generally available through device 100; or the like.
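
For illustration only, the following sketch shows how a comparison of a current trait vector against the stored reference may be mapped onto such fraud-stopping or step-up authentication operations; the deviation measure, the threshold, and the action names are assumptions of this sketch.

```python
# Illustrative authentication decision and follow-up actions (names assumed).
import numpy as np

def authenticate(current_trait, reference_trait, reference_std, threshold=3.0):
    # Per-feature deviation of the current trait from the stored reference.
    z = np.abs(np.asarray(current_trait) - reference_trait) / (reference_std + 1e-9)
    return bool(np.max(z) <= threshold)   # True: consistent with the genuine user

def on_authentication_result(is_genuine):
    if is_genuine:
        return []
    # Demonstrative fraud-stopping / additional authentication operations.
    return ["require_pin_reentry", "ask_security_question", "restrict_account_access"]
```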


Correlator 131 may identify user-specific physiological correlations. For example, correlator 131 may identify one or more geometric place(s), on touch-screen 105 or in a space proximate to touch-screen 105, in which a user gesture is associated with movement of a user body part (e.g., the thumb; one or more fingers; the palm or wrist) while also being associated with rest or non-movement of other body parts of the user. Based on the user-specific physiological correlations, trait extractor 132 may extract user-specific physiological trait(s).


In a demonstrative example, trait extractor 132 may determine that for the user Adam, a vertical scroll-down touch-gesture is typically associated with movement of the root of the thumb, while the other fingers are at rest and while the wrist or palm-base are at rest; whereas, for the user Bob, a vertical scroll-down touch-gesture is typically associated with both movement of the root of the thumb, as well as with slight rotational movement of fingers that hold or support the rear of the mobile device, and while the wrist or palm-base are at rest. This may be subsequently used for user authentication or for identity confirmation, to distinguish between a “genuine” user (e.g., Adam) and a fraudulent user or non-genuine user (e.g., Bob) when the user of device 100 performs a similar user gesture.


In another demonstrative embodiment, correlator 131 may determine that the user of device 100 (e.g., the “genuine” user), while performing a primary gesture or an intended gesture (e.g., required in order to provide user input to device 100), typically also performs a secondary gesture, namely an inadvertent gesture (e.g., not required in order to provide user input to device 100). For example, the primary gesture may be a scrolling gesture, a zoom-in or zoom-out gesture, a dragging gesture, a tapping gesture, or other user input gesture; whereas, the secondary gesture (e.g., the inadvertent or unintended gesture, of which the user may not even be aware) may be, for example, slight or significant rotating or spinning of device 100, slight or significant movement of device 100 (e.g., in a particular direction), slight or significant tilting or slanting of device 100 (e.g., at a particular angle or range-of-angles), or the like.


In another demonstrative embodiment, correlator 131 may be associated with, or may operate in conjunction with, physiological trait estimator 139 which may be able to indirectly estimate one or more physiological traits or physiological characteristics of the user of device 100, and particularly, of the hand(s) or finger(s) (e.g., a finger, a thumb, or the like) of that user. For example, physiological trait estimator 139 may estimate a width of a finger or thumb based on a width of a swiping trace performed by the finger on touch-screen 105; may estimate a length of a finger or thumb based on a radius of a circular or arched or curved swiping motion on touch-screen 105; may estimate the distance between the tip of a finger or thumb and the palm of the hand, or the wrist; may estimate other dimensions of hand-parts, or relations between such hand parts; or the like. Physiological trait estimator 139 may thus estimate physiological characteristics which may be unique to a particular user, and may assist in confirming user identity and/or in detecting a non-genuine user impersonating the genuine user.
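
For example, the radius of an arched swipe may be recovered with a simple circle fit, and that radius may serve as a proxy for the length of the swiping finger or thumb; the following is a minimal sketch under that assumption (an algebraic circle fit is used here for illustration and is not necessarily the estimator employed by physiological trait estimator 139).

```python
# Minimal sketch: estimate the radius of an arched swipe from touch points.
import numpy as np

def estimate_swipe_radius(xs, ys):
    """Algebraic (Kasa) circle fit: x^2 + y^2 + D*x + E*y + F = 0."""
    xs, ys = np.asarray(xs, float), np.asarray(ys, float)
    A = np.column_stack([xs, ys, np.ones_like(xs)])
    b = -(xs ** 2 + ys ** 2)
    (D, E, F), *_ = np.linalg.lstsq(A, b, rcond=None)
    cx, cy = -D / 2.0, -E / 2.0
    return float(np.sqrt(cx ** 2 + cy ** 2 - F))   # radius of the fitted arc

# Example: points on an arc of radius ~60 mm (roughly a thumb pivoting).
theta = np.linspace(0.2, 1.2, 30)
print(estimate_swipe_radius(60 * np.cos(theta), 60 * np.sin(theta)))  # ~60.0
```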


Additionally or alternatively, correlator 131 may be associated with, or may operate in conjunction with, a motor control estimator 138 which may estimate user-specific motor control parameters based on the user's interaction with mobile device 100. Such parameters may include, for example, parameters of the action-perception loop modeling the hand-eye coordination, as well as control loop parameters, motor noise, perception noise, or the like. Motor control estimator 138 may estimate user-specific parameters of motor control, which may be more inherent to the user and may be less action-dependent.


In a demonstrative implementation, for example, motor control estimator 138 may track a user gesture on the touch-screen (e.g., a scroll or swipe gesture). The movement or gesture may begin at rest in a start-point (X0, Y0), and may end at rest in an end-point (X1, Y1). A demonstrative control loop of the second order, for example, may assume that the force of the hand is governed by a linear combination of two error terms: a translation error, and the current velocity error.


The translation error may be represented as:

Δx=(x1−x(t))


The current velocity error may be represented as:

Δvx=(d/dt)x(t)

The control loop may be represented (for the X-axis, and similarly and separately also for the Y-axis) as:

d²x(t)/dt²=αxΔx+βxΔvx+nx


In the last equation, αx and βx are control loop parameters, and nx is motor control noise (e.g., Gaussian random variable). Accordingly, motor control estimator 138 may estimate or may simulate trajectories which may be similar to human trajectories; and although a velocity curve may be different for each repetition of the same movement, the velocity curve may be generated by the same model parameters of that specific user. Motor control estimator 138 may thus estimate these three parameters (for the X-axis, and/or for the Y-axis), thereby estimating user-specific motor control traits which may be used for differentiating between a genuine user and an impersonator or attacker, regardless of the specific movement(s) or gesture(s) performed. The above is only a demonstrative example, and motor control estimator 138 may utilize other motor control estimations, forward model(s), feedback model(s), estimation of similar peak velocity (or other movement properties) for different movements (e.g., if the error terms are distorted by a non-linear function).
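
For demonstrative purposes, the control loop described above may be simulated and its parameters re-estimated from an observed trajectory, for example by finite differences and least squares; in the following Python sketch the values of αx, βx and the noise scale are arbitrary illustration choices, and the estimation procedure is only one possible approach.

```python
# Sketch of the second-order control loop along one axis:
#   x''(t) = alpha_x * (x1 - x(t)) + beta_x * (dx/dt) + n_x
# Simulate a gesture from rest at x0 to rest near x1, then re-estimate the
# control loop parameters from the trajectory by least squares.
import numpy as np

def simulate_gesture(alpha, beta, noise_std, x0=0.0, x1=100.0, dt=0.01, steps=300):
    x, v, xs = x0, 0.0, []
    for _ in range(steps):
        n = np.random.randn() * noise_std               # motor control noise
        a = alpha * (x1 - x) + beta * v + n             # control loop
        v += a * dt
        x += v * dt
        xs.append(x)
    return np.array(xs), dt, x1

def estimate_control_params(xs, dt, x1):
    v = np.gradient(xs, dt)                             # finite-difference velocity
    a = np.gradient(v, dt)                              # finite-difference acceleration
    A = np.column_stack([x1 - xs, v])
    (alpha_hat, beta_hat), *_ = np.linalg.lstsq(A, a, rcond=None)
    noise_hat = np.std(a - A @ np.array([alpha_hat, beta_hat]))
    return alpha_hat, beta_hat, noise_hat

xs, dt, x1 = simulate_gesture(alpha=20.0, beta=-9.0, noise_std=5.0)
print(estimate_control_params(xs, dt, x1))   # approximately recovers alpha and beta
```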


Additionally or alternatively, correlator 131 may identify user-specific behavioral correlations. For example, correlator 131 may identify that when a particular user performs a particular user-gesture, performance of the gesture affects in a particular way the acceleration data and/or the orientation/rotation data of device 100. Based on the user-specific behavioral correlations, trait extractor 132 may extract user-specific behavioral trait(s).


In a demonstrative example, trait extractor 132 may determine that for the user Adam, a horizontal swipe gesture is typically associated with a counter-clockwise rotation in the range of 10 to 15 degrees around a vertical axis (e.g., a rotation axis parallel to the longest dimension of device 100); whereas, for the user Bob, a horizontal swipe gesture is typically associated with a clockwise rotation in the range of 5 to 10 degrees (or, with substantially no rotation at all) around that vertical axis. This may be subsequently used for user authentication or for identity confirmation, to distinguish between a “genuine” user (e.g., Adam) and a fraudulent user or non-genuine user (e.g., Bob) when the user of device 100 performs a similar user gesture.


Correlator 131 may be configured to search for, and detect, other user-specific behavioral correlations, for example: correlations based on the manner of holding device 100 (e.g., a primary angle of holding), and the effect of various user gestures on such holding or on the primary angle of holding; correlations based on the stability or the shakiness of device 100 (e.g., optionally taking into account the amount and/or frequency and/or timing of hand vibrations or hand movements), and the effect of various user gestures on such device stability or shakiness, or on stability or shakiness of the hand of the user that holds or operates device 100; correlations based on movement, spinning, rotation and/or acceleration of device 100, along one axis or two axes or three axes, as a result of (or concurrently with) a user gesture such as, for example, tap, double-tap, prolonged tap, release, drag, drag and drop, click, double-click, rotation or movement of an on-screen object, rotation of device 100 by 90 degrees or 180 degrees or 270 degrees, horizontal or vertical or diagonal swipe gesture, scroll gesture, zoom-in or zoom-out gestures, user operations on physical buttons or sliders or interface components of device 100 (e.g., volume interface, camera button, button for capturing an image or a video), or the like.


Correlator 131 may further detect correlations based on movement, spinning, rotation and/or acceleration of device 100, along one axis or two axes or three axes, that occur prior to or subsequent to a user gesture. For example, correlator 131 may detect that a first particular user typically tilts the phone from being generally perpendicular to the ground, to being generally parallel to the ground, immediately prior to performing a zoom-out gesture (e.g., a “pinching” gesture with two fingers on touch-screen 105). Similarly, correlator 131 may detect that a second particular user typically rotates the phone counter-clockwise, immediately subsequent to performing a zoom-in gesture (e.g., spacing apart two fingers on touch-screen 105). In some implementations, for example, a correlation may be detected while the user gesture is performed, immediately before the user gesture is performed (e.g., within 0.5 seconds prior to the user gesture), and/or immediately after the user gesture is performed (e.g., within 0.5 seconds subsequent to the user gesture).
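
A hedged sketch of such pre-gesture and post-gesture analysis follows: the gyroscope rate around one axis is integrated over the 0.5-second windows immediately before and immediately after each gesture time-stamp (the window size follows the example above; everything else is an assumption for illustration).

```python
# Illustrative summary of device rotation just before and just after gestures.
import numpy as np

def rotation_around_gesture(gesture_times, gyro_times, gyro_rate, window_s=0.5):
    """gyro_rate: angular velocity (deg/s) around one axis; returns, per
    gesture, the net rotation angle before and after the gesture."""
    results = []
    for t in gesture_times:
        before = (gyro_times >= t - window_s) & (gyro_times < t)
        after = (gyro_times > t) & (gyro_times <= t + window_s)
        angle_before = np.trapz(gyro_rate[before], gyro_times[before]) if before.any() else 0.0
        angle_after = np.trapz(gyro_rate[after], gyro_times[after]) if after.any() else 0.0
        results.append((angle_before, angle_after))
    return results
```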


Optionally, correlator 131 may detect other suitable correlations, and may take into account other types of readings or sensed data, for example, data indicating a temperature or moisture level or sweat level which may be associated with a user gesture, data indicating the amount of pressure or force applied by a user (e.g., when pressing on touch-screen 105), or the like.


In a demonstrative example, a first user may typically scroll down with his finger on touch-screen 105 while slightly rotating the mobile device 100 around its longest axis; and a correlation may be identified between the respective touch data and acceleration/orientation data, indicative of the first user. In contrast, a second user may typically scroll down while maintaining the mobile device 100 non-rotating, or while rotating mobile device 100 at a different direction or angle, or at a different acceleration value, thereby allowing to identify a different correlation, indicative of the second user.


Optionally, the present invention may identify, create and utilize a first set of behavioral traits which correspond to the behavior of a particular user when he is utilizing his mobile device in a first holding scenario (e.g., when the user is holding the mobile device in his hand), and a second (different) set of behavioral traits which correspond to the behavior of that particular user when he is utilizing his mobile device in a second holding scenario (e.g., when the mobile device is placed on a table or flat surface and the user operates the mobile device without holding it). Accordingly, the present invention may create and utilize a behavioral profile for that user, which may comprise multiple sub-profiles of behavioral traits that correspond to such multiple usage scenarios by the same (e.g., “genuine”) user. In a subsequent usage of the mobile device, the system may compare the behavioral traits of the subsequent user, to each one (e.g., separately) of the pre-stored sets of behavioral traits (or behavioral sub-profiles), in order to detect or determine whether that subsequent user is the “genuine” user operating in one of the known usage scenarios, or alternatively a fraudulent user or attacker. Similarly, the present invention may generate and/or utilize complex profiles that may comprise sub-profiles or sets of traits (e.g., behavioral traits, physiological traits, motor control traits), such that each set or sub-profile may correspond to a particular usage scenario or a particular holding scenario of the user; and a subsequent usage may be compared, separately, to each one of those sub-profiles (or sets of traits) in order to determine user authenticity.
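
For illustration, a subsequent session may be compared against each stored sub-profile separately and accepted if it matches any sub-profile of the genuine user; in the following sketch the scenario labels, the distance measure, and the threshold are assumptions, not part of the claimed method.

```python
# Illustrative matching of a session against per-scenario sub-profiles.
import numpy as np

def matches_any_subprofile(session_traits, subprofiles, threshold=3.0):
    session = np.asarray(session_traits, dtype=float)
    best_scenario, best_distance = None, np.inf
    for scenario, (mean, std) in subprofiles.items():
        distance = np.max(np.abs(session - mean) / (std + 1e-9))   # worst-case z-score
        if distance < best_distance:
            best_scenario, best_distance = scenario, distance
    return best_distance <= threshold, best_scenario

subprofiles = {
    "hand_held": (np.array([0.8, 12.0]), np.array([0.1, 2.0])),
    "on_table":  (np.array([0.1,  3.0]), np.array([0.05, 1.0])),
}
print(matches_any_subprofile([0.75, 11.0], subprofiles))   # (True, 'hand_held')
```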


The terms “correlation”, “correlator”, “to correlate”, and similar or equivalent terms which may be used herein, are used for demonstrative purpose only; they may include, for example, statistical correlation, or statistically-significant correlation, or any other type of relation or indication or matching between two parameters or between groups of values. In some embodiments, there need not be statistically-significant correlation between, for example, touch data and acceleration data, in order to identify or extract unique user trait(s); but rather, there may be other type of relation or matching between touch-data and acceleration data in order to determine such “correlation”.


In accordance with the present invention, mobile device 100 may continuously track and/or monitor the correlation between touch-data and acceleration/orientation data. Correlation values may be used to determine user-specific traits that are indicative of the user of the mobile device 100, which may be regarded initially as the “genuine” user. Then, during subsequent usage of the mobile device 100, correlation between touch-data and acceleration/orientation data may be tracked and identified, and may be compared to the correlation previously-determined for the genuine user, in order to confirm that a current user is indeed the genuine user, or in order to determine or to estimate that a current user is a non-genuine user.


In a demonstrative implementation, an application or a website may be accessible through device 100 through an access control process or a user authentication process. Such application or website may be, for example, an email account, a social network account, a video conference application, a chat application, an online banking application or website, a securities trading application or website, an electronic commerce account or website, or the like. The user may be prompted to create a new user account (e.g., define a username and password); and then, or in parallel, user-specific traits may be captured through passive means and/or active means, which may be known to the user or may be hidden from the user.


For example, a profile creation page or application may require the user to perform various touch operations (e.g., tapping, scrolling, dragging, or the like), and may capture touch data as well as acceleration/orientation data, which may then be correlated in order to identify a biometric trait indicative of the user who is currently creating the profile, or who is otherwise believed to be a genuine user (e.g., based on password entry and/or responses to security questions or other challenge-response mechanisms). Optionally, an active challenge may be posed to the user, for example, by explicitly asking the user to perform one or more particular touch gestures on touch-screen 105, either as “hidden” challenges (in which the user is not aware that he is actively challenged for security purposes) or as non-hidden challenges (in which the user is advised that, as a security measure, he is required to perform certain touch gestures in order to extract biometric traits).


Reference is made to FIG. 5, which is a flow-chart of a method in accordance with some demonstrative embodiments of the present invention. The method may be implemented by a mobile electronic device, by one or more hardware components and/or software modules of a mobile electronic device, by a system, or the like.


The method may include, for example, capturing at least one of touch data, hovering data, motion data, gesture data (block 510).


The method may include, for example, capturing at least one of acceleration data, gyroscope data, device orientation/rotation data, principal axes rotation data (e.g., normal axis or yaw, lateral axis or pitch, longitudinal axis or roll) (block 520).


The operations of block 520 may be performed simultaneously or concurrently with, or in parallel to, the operations of block 510.


The method may include, for example, correlating or matching (block 530) between the data captured in block 510 and the data captured in block 520.


The method may include, for example, extracting a user-specific trait (block 540) based on the correlating or matching of block 530. The user-specific trait may include, for example, one or more behavioral traits, physiological traits, motor control traits, or other user-specific characteristics.


The method may include, for example, subsequently, confirming user identity based on said user-specific trait (block 550).


Other suitable operations may be used in accordance with the present invention.


Some embodiments of the present invention may be utilized, or may operate, in conjunction with methods, algorithms, devices and/or systems which are described in PCT International Application Number PCT/IL2011/000907, titled “Method and Device for Confirming Computer End-User Identity”, published on Jun. 7, 2012 as International Publication Number WO/2012/073233, which is hereby incorporated by reference in its entirety; and/or in U.S. patent application Ser. No. 13/877,676, filed on Apr. 4, 2013, which is hereby incorporated by reference in its entirety.


In accordance with the present invention, correlation between touch-data and acceleration/orientation data may be identified and/or checked locally in mobile device 100; or remotely, such as in a remote server which may receive such data via a wireless communication link from mobile device 100; or by using other suitable architecture, for example, a hybrid architecture in which some operations may be performed locally and other operations may be performed remotely. Accordingly, components or modules that are depicted, for demonstrative purposes, as being included in mobile device 100, may be implemented at a remote server or within other suitable units. The present invention may be implemented in a stand-alone mobile device, such that data collection and processing may be performed within device 100; or in a client-server architecture, such that device 100 may collect data and may wirelessly transmit the collected data to a remote server for processing and analysis; or in a “cloud computing” architecture in which data is stored remotely and is also processed remotely. Other suitable architectures may be used, to deploy a system in which a particular mobile device “knows” or recognizes its genuine user, or, to deploy a system in which a particular application or website “knows” or recognizes a genuine user, based on the above-mentioned correlations.


In a demonstrative experiment in accordance with the present invention, multiple participants were asked to hold a particular mobile device (an iPad tablet), to drag (with a finger) a displayed green circle towards a displayed red target, and then to release the dragged item once it reached the red target. Accelerometer data and touch data were collected while performing the requested operations.


The experiment measured the touch and release signals, as well as accelerometer measurements; and then triggered the acceleration data according to the touch time. FIG. 2 depicts three graphs 201-203, which demonstrate acceleration as a function of time over three separate axes, thereby demonstrating at least two identifying characteristics which may be used as a user-specific trait. As a first identifying characteristic, the phasic level (observed at the X axis) may have different values for different people, corresponding to different posture of the mobile device. As a second identifying characteristic, the transient shape once the device is tapped (observed at the Z axis) may have different values for different people. This data may be transformed or analyzed, for example, by using dimension reduction techniques (e.g., kernel principal component analysis), thereby demonstrating the biometric capability of synergizing between touch data and acceleration data.
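
A hedged sketch of this dimension-reduction step is shown below, using kernel PCA from scikit-learn on synthetic tap-triggered accelerometer segments (the data, kernel, and parameters are illustrative assumptions; the experiment itself used real measurements).

```python
# Illustrative dimension reduction of tap-triggered accelerometer segments.
import numpy as np
from sklearn.decomposition import KernelPCA

rng = np.random.default_rng(0)
# Two simulated users, 30 taps each, 50-sample Z-axis segment per tap.
t = np.linspace(0, 0.5, 50)
user_a = 0.5 * np.exp(-10 * t) + 0.02 * rng.standard_normal((30, 50))
user_b = 0.3 * np.exp(-25 * t) + 0.02 * rng.standard_normal((30, 50))
X = np.vstack([user_a, user_b])

embedding = KernelPCA(n_components=2, kernel="rbf", gamma=0.1).fit_transform(X)
print(embedding.shape)          # (60, 2): one 2-D point per tap trial
```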


Reference is made to FIG. 3, which demonstrates a graph 300 of the main axes of the dimension-reduced space of the accelerometer reaction to tapping. Each small item in graph 300 represents one trial, and each shape or character in graph 300 (e.g., circle, square, diamond, triangle) represents a different user. This drawing demonstrates identifiable clusters 301-309 of trials, each such cluster corresponding to a different user.


In certain scenarios, posture data (e.g., phasic response) may be neglected or may not be available, for example, if the mobile device is operated while being placed on a table or a flat surface and is not hand-held by the user. In such scenarios, only the device's kinematics during taps may be taken into account, and still the present invention may capture sufficient information for biometric functions. Reference is made to FIG. 4 which illustrates a graph 400 depicting the feature space, where each dot represents a trial; greyed circles represent trials performed by one particular user, and black circles represent trials performed by the other participants. This drawing demonstrates dimension reduction when only the device's kinematics are taken into account, showing that, still, sufficient and significant biometric information may be captured and determined.


The present invention may be used in order to automatically identify that a user (e.g., an attacker or a “fraudster”) is attempting to pose as (or impersonate, or “spoof”) another user (e.g., the “real” user or the genuine user). In accordance with the present invention, the attacker would need to carefully and correctly imitate the exact accelerometer response for tapping (or for other suitable touch-screen operations, such as scrolling, dragging, releasing), taking into account the particular kinematics properties of the genuine user; and such imitation may be extremely difficult and unlikely, or even impossible, for most attackers.


The present invention may utilize signal processing and/or machine learning techniques, in order to build or generate a template model or a profile which corresponds to the genuine user; and then compare subsequent instance(s) or sample(s) to the pre-built (and locally stored, or remotely stored) model or profile. If the subsequent samples are consistent with the pre-built model or profile, then a first output score may be generated (e.g., having a high value in a predefined numeric range, such as a value of 98 on a scale of 0 to 100); whereas, if the subsequent samples are inconsistent with the pre-built model or profile, then a second output score may be generated (e.g., having a lower value on the predefined numeric range, such as a value of 34 on the scale of 0 to 100). In some implementations, an output score greater than a threshold value may be used (alone, or in combination with other biometric traits and/or other authentication measures) as an indication that the current user is the genuine user; whereas an output score lower than the threshold value may be used (alone, or in combination with other biometric traits and/or other authentication measures) as an indication that the current user is not the genuine user.
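
For illustration only, one possible way to map the deviation between a fresh sample and the stored template onto such a 0-to-100 output score is sketched below; the mapping function and the threshold value are assumptions of this sketch and not the scoring method of the present invention.

```python
# Illustrative 0-100 scoring of a fresh sample against the stored template.
import numpy as np

def similarity_score(sample, template_mean, template_std):
    z = np.abs(np.asarray(sample) - template_mean) / (template_std + 1e-9)
    return float(100.0 * np.exp(-np.mean(z)))      # 100 = identical, near 0 = very far

def is_genuine(score, threshold=70.0):
    # Scores above the threshold are treated as consistent with the genuine user.
    return score >= threshold
```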


The present invention may further be used to differentiate or distinguish between the genuine (human) user, and a robot or a machine-operable module or function (e.g., implemented as a computer virus, a Trojan module, a cyber-weapon, or other malware) which attempts to automatically imitate or emulate or simulate movement of a cursor or other interaction with a touch-screen. For example, false identity created by automated malware may be detected by the present invention as such automated malware may lack the characterization of human (e.g., manual) behavior, such as the touch features (e.g., speed, pressure) and/or its accelerometer correlated measurements.


The present invention may operate and may provide an efficient biometric or user-authentication modality, without capturing, storing, or otherwise identifying any Personally Identifiable Information (PII). For example, the present invention may be used to distinguish between a genuine user and a fraudster, without knowing any PII of the genuine user and/or of the fraudster.


The present invention may detect correlations and extract user-specific traits based on passive data collection and/or based on active challenges. In passive data collection, the mobile device may detect that the user is performing a particular operation (e.g., a vertical scroll gesture), and may further detect that performing this gesture affects in a user-specific way the acceleration and/or the orientation/rotation of the mobile device. In an active challenge, the mobile device (or an application or process thereof) may actively present a challenge to the user, such as a requirement that the user perform horizontal scrolling, in order to capture data and detect user-specific correlation(s). The active challenge may be hidden or may be unknown to the user, for example, implemented by creating a Graphical User Interface (GUI) that requires the user to scroll in order to reach a “submit” button or a “next” button or a “continue” button, thereby “forcing” the user to unknowingly perform a particular user-gesture which may be useful for correlation detection or for extraction of user-specific traits, as described. Alternatively, the active challenge may be known to the user, and may be presented to the user as an additional security feature; for example, by requesting the user to drag and drop an on-screen object from a first point to a second point, as an action that may be taken into account for confirming user identity.


Some embodiments of the present invention may be implemented, for example, as a built-in or integrated security feature which may be a component or a module of a mobile device, or may be a downloadable or install-able application or module, or plug-in or extension; or as a module of a web-site or web-page, or of a client-server system or a “cloud computing” system; or as machine-readable medium or article or memory unit able to store instructions and/or code which, when executed by the mobile device or by other suitable machine (e.g., a remote server, or a processor or a computer) cause such machine to perform the method(s) and/or operations described herein. Some units, components or modules, that are shown in FIG. 1 for demonstrative purposes as comprised within mobile device 100, may be implemented externally to mobile device 100, may be implemented in a remote server, a web server, a website or webpage, a “cloud computing” server or database, a client/server system, a distributed system, a peer-to-peer network or system, or the like.


In some embodiments of the present invention, the analysis or correlation or matching (e.g., between accelerometer/gyroscope data, and touch-data or hovering data or other user-gesture data) may be location-based and/or application-based, or may otherwise take into account a geographical location or geo-spatial location of the mobile device or the application(s) being used or that are installed on the device. In a demonstrative example, a suitable module (e.g., a location-aware module or location-determining module) in the mobile device may determine the current location of the mobile device, based on GPS data or Wi-Fi data or cellular triangulation data or mobile network cell data or other location-identification techniques. The mobile phone may then utilize a suitable module (e.g., a correlator or matching module between location and user-specific behavioral usage traits) in order to deduce or determine, for example: that when the user utilizes his mobile device in a first location (e.g., in his office), then the mobile phone is typically placed horizontally on a flat surface (e.g., a table); that when the user utilizes his mobile phone in a second location or type of location (e.g., outdoor, on the street, in the park), then the mobile phone is typically held by the hand of the user at a slanted angle or diagonally (e.g., at approximately 45 to 60 degrees relative to the ground); that when the user utilizes his mobile phone in a third location or type of location (e.g., at a Point-Of-Sale (POS) terminal or register or cashier, at a supermarket or a retail store), then the mobile phone is typically held generally horizontally by the hand of the user (e.g., generally parallel to the ground); that when the user utilizes his mobile phone in a fourth location or type of location (e.g., at an Automatic Teller Machine (ATM) or a vending machine), then the mobile phone is typically held generally vertically by the hand of the user (e.g., at an angle of approximately 90 degrees, or between 80 and 100 degrees, relative to the ground); or the like.


These determinations may be location-based or location-aware, thereby triangulating or crossing among three dimensions, namely, behavioral user-specific traits (e.g., holding the phone diagonally), gesture data (e.g., performing a scroll-down gesture), and location data (e.g., when utilizing the phone at a retailer); and such determinations may be part of the user-specific profile of that user. In a subsequent usage of the mobile device, similar determinations may be made, in order to analyze whether or not a current user is indeed the same user as in previous usage session(s) or is a “genuine” user.


In a demonstrative example, this three-pronged approach may raise an alert if, for example, typically the user of the mobile device holds his mobile device horizontally when performing a scroll-operation at a Point of Sale terminal; and in a subsequent usage session of the mobile device, a user holds that phone vertically when performing a scroll-operation at such Point of Sale terminal, thereby indicating that the subsequent user may not be the genuine or authorized user of the mobile device.


In some embodiments, these multi-pronged determinations may further be augmented with, or matched or correlated with, application-specific data or application-specific determinations, in order to improve the tailoring of the behavioral traits to the specific user.
For example, the mobile device may differentiate and determine that the genuine user typically holds the phone vertically (e.g., anywhere, or in a particular location or type of location) when utilizing the camera application of the mobile device, but typically holds the phone horizontally (e.g., anywhere, or in that particular location or type of location) when utilizing the address book application of the mobile device; and these user-specific traits may be extracted and subsequently compared to data captured in a subsequent usage session of that mobile device, to authenticate user identity.
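
The location-aware and application-aware matching described above may be illustrated, purely as a sketch, by a lookup of the typical device pitch angle per (location type, gesture) pair; the categories, angle ranges, and table form are assumptions for demonstration.

```python
# Illustrative cross-check of location type, gesture, and device pitch angle.
TYPICAL_PITCH_DEG = {
    ("pos_terminal", "scroll_down"): (0, 20),      # held roughly parallel to ground
    ("atm",          "scroll_down"): (80, 100),    # held roughly vertically
    ("office",       "scroll_down"): (0, 10),      # resting on a flat surface
}

def pitch_matches_habit(location_type, gesture, observed_pitch_deg):
    expected = TYPICAL_PITCH_DEG.get((location_type, gesture))
    if expected is None:
        return True                                # no habit learned yet: no alert
    low, high = expected
    return low <= observed_pitch_deg <= high

print(pitch_matches_habit("pos_terminal", "scroll_down", 85))   # False -> raise alert
```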


The present invention may be used in conjunction with various suitable devices and systems, for example, various devices that have a touch-screen; an ATM; a kiosk machine or vending machine that has a touch-screen; a touch-keyboard; a system that utilizes Augmented Reality (AR) components or AR glasses (e.g., Google Glass); a device or system that may detect hovering gestures that do not necessarily touch on the screen or touch-screen; a hovering screen; a system or device that utilizes brainwave analysis or brainwave control in which the user's brainwaves are captured or read and the user's brain may directly control an application on the mobile device; and/or other suitable devices or systems.


Although portions of the discussion herein relate, for demonstrative purposes, to wired links and/or wired communications, some embodiments of the present invention are not limited in this regard, and may include one or more wired or wireless links, may utilize one or more components of wireless communication, may utilize one or more methods or protocols of wireless communication, or the like. Some embodiments may utilize wired communication and/or wireless communication.


Functions, operations, components and/or features described herein with reference to one or more embodiments of the present invention, may be combined with, or may be utilized in combination with, one or more other functions, operations, components and/or features described herein with reference to one or more other embodiments of the present invention.


While certain features of the present invention have been illustrated and described herein, many modifications, substitutions, changes, and equivalents may occur to those skilled in the art. Accordingly, the claims are intended to cover all such modifications, substitutions, changes, and equivalents.

Claims
  • 1. A method for authenticating identity of a user of an electronic device, the method comprising: receiving sensor data, which comprises: (i) user gesture data based on a user gesture; and (ii) device orientation data from a gyroscope of the electronic device; and (iii) device acceleration data from an accelerometer of the electronic device; determining a relation among the sensor data;
  • 2. The method of claim 1, wherein said passive data collection comprises capturing the user gesture data based on the user gesture without posing an active challenge to said user.
  • 3. The method of claim 1, wherein the user gesture data comprises: touch data from a touch-screen of the electronic device; wherein determining the relation among the sensor data comprises: determining a relation among (i) touch data, and (ii) device orientation data, and (iii) device acceleration data.
  • 4. The method of claim 1, wherein the reference value of the user-specific trait is stored locally within the electronic device and/or on a remote server.
  • 5. The method of claim 1, wherein the reference value of the user-specific trait is stored locally within the electronic device and/or on a remote server, without requesting active ad hoc user confirmation for storing said user-specific trait.
  • 6. The method of claim 1, wherein the reference value of the user-specific trait is stored locally within the electronic device; wherein determining whether or not the current user of the electronic device is an authorized user of the electronic device, is based on said reference value that is stored locally within the electronic device.
  • 7. The method of claim 1, wherein the reference value of the user-specific trait is stored remotely on a remote server that is external to said electronic device; wherein determining whether or not the current user of the electronic device is an authorized user of the electronic device, is based on said reference value that is stored remotely on said remote server.
  • 8. The method of claim 1, wherein the method further comprises: determining an offset of holding the electronic device in a hand of the user, wherein the offset comprises one or more of: (i) the electronic device being held with a palm area of the hand, (ii) the electronic device being held with a fingers area of the hand; wherein generating the user-specific trait is further based on a relation among the sensor data and the offset of holding the electronic device in the hand, wherein the user-specific trait further reflects the manner in which the user is holding the electronic device in the hand.
  • 9. The method of claim 1, wherein the method further comprises: constructing a user-specific profile based on said relation among the sensor data, wherein the constructing of the user-specific profile is performed over a pre-defined time-period; dynamically shortening the pre-defined time period for constructing the user-specific profile if one or more identified traits of the user are distinctive as user-specific.
  • 10. The method of claim 1, wherein the method further comprises: constructing a user-specific profile based on said relation among the sensor data, wherein the constructing of the user-specific profile is performed within a constraint selected from one or more of: (I) a pre-defined time-period, (II) a pre-defined number of user interactions; dynamically modifying the constraint for constructing the user-specific profile, based on distinctiveness of one or more traits of the user.
  • 11. The method of claim 1, wherein the method further comprises: constructing a user-specific profile based on said relation among the sensor data, wherein the constructing of the user-specific profile is performed within a constraint selected from one or more of: (I) a pre-defined time-period, (II) a pre-defined number of user interactions; dynamically modifying the constraint for constructing the user-specific profile, based on distinctiveness of one or more traits of the user; storing a flag indicating whether the user-specific profile is either (i) under construction, or (ii) fully constructed.
  • 12. The method of claim 1, wherein the method comprises: determining a currently-used application of the electronic device that the user is currently utilizing on the electronic device; determining the relation among the sensor data and the currently-used application of the electronic device; wherein generating the user-specific trait is further based on a manner in which the user is orienting the electronic device while performing the user gesture and while the user is using the currently-used application.
  • 13. The method of claim 12, wherein the method further comprises: based on said relation, (A) determining that the user typically holds the electronic device vertically when utilizing a first particular application of the electronic device, and (B) determining that the user typically holds the electronic device slanted relative to the ground when utilizing a second particular application of the electronic device; wherein generating the user-specific trait is further based on (I) determination that the user typically holds the electronic device vertically when utilizing the first particular application of the electronic device, and on (II) determination that the user typically holds the electronic device slanted relative to the ground when utilizing the second particular application of the electronic device, while performing the user gesture and while the user is using the currently-used application.
  • 14. The method of claim 1, wherein the method further comprises: determining whether a current location of the electronic device is outdoors or indoors; wherein generating the user-specific trait is further based on a manner in which the user is orienting the electronic device while performing the user gesture and while the current location of the electronic device is either outdoors or indoors.
  • 15. The method of claim 1, wherein the sensor data further comprises a pressure exerted on the touch-screen by a body part of the user; wherein generating the user-specific trait is further based on a manner in which the user is orienting the electronic device while touching the touch-screen and while exerting the pressure on the touch-screen by the body part of the user.
  • 16. The method of claim 1, wherein the sensor data further comprises a current location of the electronic device; wherein generating the user-specific trait is further based on a manner in which the user is both accelerating and orienting the electronic device while also touching the touch-screen of the electronic device and while the electronic device is located at said current location.
  • 17. The method of claim 1, wherein generating the user-specific trait is further based on a manner in which the user is accelerating the electronic device during touching a touch-screen of the electronic device.
  • 18. The method of claim 1, wherein generating the user-specific trait is further based on a manner in which the user is accelerating the electronic device immediately prior to touching a touch-screen of the electronic device.
  • 19. The method of claim 1, wherein generating the user-specific trait is further based on a manner in which the user is accelerating the electronic device immediately subsequent to touching a touch-screen of the electronic device.
  • 20. The method of claim 1, wherein generating the user-specific trait is further based on a manner in which the user is accelerating the electronic device (I) immediately prior to touching a touch-screen of the electronic device, and (II) during touching the touch-screen of the electronic device, and (III) immediately subsequent to touching the touch-screen of the electronic device.
  • 21. The method of claim 1, wherein generating the user-specific trait is further based on a manner in which the user is orienting the electronic device during touching a touch-screen of the electronic device.
  • 22. The method of claim 1, wherein generating the user-specific trait is further based on a manner in which the user is orienting the electronic device immediately prior to touching a touch-screen of the electronic device.
  • 23. The method of claim 1, wherein generating the user-specific trait is further based on a manner in which the user is orienting the electronic device immediately subsequent to touching a touch-screen of the electronic device.
  • 24. The method of claim 1, wherein generating the user-specific trait is further based on a manner in which the user is orienting the electronic device (I) immediately prior to touching a touch-screen of the electronic device, and (II) during touching the touch-screen of the electronic device, and (III) immediately subsequent to touching the touch-screen of the electronic device.
  • 25. The method of claim 1, wherein the method further comprises: determining whether (I) the same hand of the user is utilized for both holding the electronic device and tapping the touch-screen of the electronic device, or (II) a first hand of the user is utilized for holding the electronic device and a second hand of the user is utilized for tapping the touch-screen of the electronic device; wherein generating the user-specific trait is further based on a manner in which the user is holding the electronic device while also tapping the touch-screen of the electronic device.
  • 26. The method of claim 1, wherein the method further comprises: determining that when the user performs the user gesture at a particular geometric place of the touch-screen of the electronic device, a first body part of the user is moving while a second body part of the user is at rest; wherein generating the user-specific trait is further based on a manner in which the first body part of the user is moving, while the second body part of the user is at rest, and while the user also performs the user gesture at the particular geometric place of the touch-screen.
  • 27. The method of claim 1, wherein the method further comprises: determining that for a scrolling gesture, which is performed on the touch-screen of the electronic device, a first hand-region of the user is moving while a second hand-region of the user is at rest; wherein generating the user-specific trait is further based on a manner in which the first hand-region of the user is moving, while the second hand-region of the user is at rest, and while the user is also performing the scrolling gesture on the touch-screen.
  • 28. The method of claim 1, wherein the method further comprises: performing analysis of touch-data of a swipe gesture performed by the user on the touch-screen of the electronic device; and based on said analysis, determining an estimated width of a finger of the user; wherein generating the user-specific trait is further based on the estimated width of the finger of the user.
  • 29. The method of claim 1, wherein the method further comprises: performing analysis of touch-data of a circular swipe gesture performed by the user on the touch-screen of the electronic device; and based on said analysis, determining an estimated distance between (I) a tip of a swiping finger of a hand of the user, and (II) a palm of the hand of the user; wherein generating the user-specific trait is further based on the estimated distance between (I) the tip of the swiping finger of the hand of the user, and (II) the palm of the hand of the user.
  • 30. The method of claim 1, wherein the method further comprises: performing analysis of touch-data of swipe gestures performed by the user on the touch-screen of the electronic device; based on said analysis, determining that the user typically rotates the electronic device clockwise while performing swipe gestures; wherein generating the user-specific trait is further based on a manner in which the user typically rotates the electronic device clockwise while also performing the swipe gestures.
  • 31. The method of claim 1, wherein the method further comprises: performing analysis of touch-data of swipe gestures performed by the user on the touch-screen of the electronic device; based on said analysis, determining that the user typically rotates the electronic device counter-clockwise while performing swipe gestures; wherein generating the user-specific trait is further based on a manner in which the user typically rotates the electronic device counter-clockwise while also performing the swipe gestures.
  • 32. The method of claim 1, wherein the method further comprises: analyzing the sensor data to determine a level of shakiness of the electronic device while the user operates the electronic device; analyzing the sensor data to determine an effect of the user-gesture on the level of shakiness of the electronic device; wherein generating the user-specific trait is further based on the effect of the user-gesture on the level of shakiness of the electronic device.
  • 33. The method of claim 1, wherein the method comprises: based on the relation among the sensor data, (a) determining that the user typically places the electronic device horizontally on a flat surface when utilizing the electronic device in a first geographic location, and (b) determining that the user typically holds the electronic device slanted relative to the ground when utilizing the electronic device in a second geographic location, and wherein generating the user-specific trait is further based on determinations (I) that the user typically places the electronic device horizontally on a flat surface when utilizing the electronic device in the first geographic location and (II) that the user typically holds the electronic device slanted relative to the ground when utilizing the electronic device in the second geographic location.
  • 34. The method of claim 1, wherein the sensor data comprises: (i) user gesture data; and (ii) device orientation data from said gyroscope of the electronic device; (iii) device acceleration data from an accelerometer of the electronic device, (iv) device geo-location data from a Global Positioning System (GPS) unit of the electronic device; wherein determining a relation among the sensor data comprises determining a user-specific relation among (i) user gesture data, and (ii) device orientation data, and (iii) device acceleration data, and (iv) device geo-location data.
  • 35. The method of claim 1, wherein the user gesture data comprises: non-tactile touch data indicating a hovering user gesture without touching a touch-screen of the electronic device; wherein determining the relation among the sensor data comprises: determining a relation among (i) non-tactile touch data indicating the hovering user gesture, and (ii) device orientation data, and (iii) device acceleration data.
  • 36. A method for authenticating identity of a user of an electronic device, the method comprising: receiving sensor data, which comprises: (i) user gesture data based on a user gesture; and (ii) device orientation data from a gyroscope of the electronic device; determining a relation among the sensor data; based on the relation among the sensor data, generating a user-specific trait indicative of the user of the electronic device, wherein the user-specific trait reflects a manner in which the user is orienting and accelerating the electronic device while performing the user gesture; storing a reference value of the user-specific trait; in a subsequent usage session: generating a current value of the user-specific trait, wherein at least a portion of the sensor data forming the basis of the relation among sensor data is received by passive data collection, and based on a comparison process between (I) the current value of the user-specific trait that was generated, and (II) the reference value of the user-specific trait that was previously generated, determining whether or not a current user of the electronic device is an authorized user of the electronic device; wherein the method comprises: based on the relation among the sensor data, (A) determining that a first physiological region of the user moves when performing the user gesture, and (B) determining that a second, different, physiological region of the user does not move when performing the user gesture; wherein generating the user-specific trait is further based on a manner in which the user moves the first physiological region while performing the user gesture and does not move the second physiological region while performing the user gesture.
  • 37. The method of claim 36, wherein said passive data collection comprises capturing the user gesture data based on the user gesture without posing an active challenge to said user.
  • 38. The method of claim 36, wherein the user gesture data comprises: touch data from a touch-screen of the electronic device; wherein determining the relation among the sensor data comprises: determining a relation among (i) touch data, and (ii) device orientation data.
  • 39. A method for authenticating identity of a user of an electronic device, the method comprising: receiving sensor data, which comprises: (i) user gesture data based on a user gesture; and (ii) device acceleration data from an accelerometer of the electronic device; determining a relation among the sensor data; based on the relation among the sensor data, generating a user-specific trait indicative of the user of the electronic device, wherein the user-specific trait reflects a manner in which the user is orienting and accelerating the electronic device while performing the user gesture; storing a reference value of the user-specific trait; in a subsequent usage session:
  • 40. The method of claim 39, wherein said passive data collection comprises capturing the user gesture data based on the user gesture without posing an active challenge to said user.
  • 41. The method of claim 39, wherein the user gesture data comprises: touch data from a touch-screen of the electronic device; wherein determining the relation among the sensor data comprises: determining a relation among (i) touch data, and (ii) device acceleration data.
  • 42. A process comprising: authenticating identity of a user of an electronic device, based at least partially on data captured from sensors of said electronic device in a passive manner without posing an active challenge to the user, by performing: receiving sensor data, which comprises: (i) user gesture data based on a user gesture; and (ii) device orientation data from a gyroscope of the electronic device; and (iii) device acceleration data from an accelerometer of the electronic device; determining a relation among the sensor data; based on the relation among the sensor data, generating a user-specific trait indicative of the user of the electronic device, wherein the user-specific trait reflects a manner in which the user is orienting and accelerating the electronic device while performing the user gesture; storing a reference value of the user-specific trait; in a subsequent usage session, generating and storing a current value of the user-specific trait; and based on a comparison process between (I) the current value of the user-specific trait that was generated, and (II) the reference value of the user-specific trait that was previously generated, determining whether or not a current user of the electronic device is an authorized user of the electronic device; wherein the process comprises: based on the relation among the sensor data, (A) determining that a first physiological region of the user moves when performing the user gesture, and (B) determining that a second, different, physiological region of the user does not move when performing the user gesture; wherein generating the user-specific trait is further based on a manner in which the user moves the first physiological region while performing the user gesture and does not move the second physiological region while performing the user gesture.
  • 43. The process of claim 42, wherein the user gesture data comprises: touch data from a touch-screen of the electronic device; wherein determining the relation among the sensor data comprises: determining a relation among (i) touch data, and (ii) device orientation data, and (iii) device acceleration data.
  • 44. The process of claim 42, wherein at least a portion of the sensor data forming the basis of the relation among sensor data is received by passive data collection from said sensors of said electronic device in a passive manner without posing an active challenge to said user.
CROSS-REFERENCE TO RELATED APPLICATIONS

This patent application is a Continuation-in-Part (CIP) of U.S. Ser. No. 15/708,155, filed on Sep. 19, 2017, which is hereby incorporated by reference in its entirety. The above-mentioned U.S. Ser. No. 15/708,155 is a Continuation-In-Part (CIP) of U.S. Ser. No. 15/422,479, filed on Feb. 2, 2017, now U.S. Pat. No. 9,779,423, which is hereby incorporated by reference in its entirety. The above-mentioned U.S. Ser. No. 15/422,479 claims priority and benefit from U.S. 62/312,140, filed on Mar. 23, 2016, which is hereby incorporated by reference in its entirety. The above-mentioned U.S. Ser. No. 15/422,479 is also a Continuation-in-Part (CIP) of U.S. Ser. No. 15/276,803, filed Sep. 27, 2016, now U.S. Pat. No. 10,055,560, which is hereby incorporated by reference in its entirety. The above-mentioned U.S. Ser. No. 15/276,803 is a Continuation-in-Part (CIP) of U.S. Ser. No. 14/325,398, filed on Jul. 8, 2014, now U.S. Pat. No. 9,477,826, which is hereby incorporated by reference in its entirety. The above-mentioned U.S. Ser. No. 14/325,398 claims priority and benefit from U.S. 61/843,915, filed on Jul. 9, 2013, which is hereby incorporated by reference in its entirety. The above-mentioned U.S. Ser. No. 14/325,398 is a Continuation-in-Part (CIP) of U.S. Ser. No. 13/922,271, filed on Jun. 20, 2013, now U.S. Pat. No. 8,938,787, which is hereby incorporated by reference in its entirety. The above-mentioned U.S. Ser. No. 14/325,398 is a Continuation-in-Part (CIP) of U.S. Ser. No. 13/877,676, filed on Apr. 4, 2013, now U.S. Pat. No. 9,069,942; which was a National Phase of PCT International Application number PCT/IL2011/000907, filed on Nov. 29, 2011; which claimed priority and benefit from U.S. 61/417,479, filed on Nov. 29, 2010; all of which are hereby incorporated by reference in their entirety. The above-mentioned U.S. Ser. No. 14/325,398 is a Continuation-in-Part (CIP) of U.S. Ser. No. 14/320,653, filed on Jul. 1, 2014, now U.S. Pat. No. 9,275,337, which is hereby incorporated by reference in its entirety. The above-mentioned U.S. Ser. No. 14/325,398 is a Continuation-in-Part (CIP) of U.S. Ser. No. 14/320,656, filed on Jul. 1, 2014, now U.S. Pat. No. 9,665,703, which is hereby incorporated by reference in its entirety. The above-mentioned U.S. Ser. No. 15/422,479 is also a Continuation-in-Part (CIP) of U.S. Ser. No. 15/210,221, filed Jul. 14, 2016, now U.S. Pat. No. 9,674,218, which is hereby incorporated by reference in its entirety. The above-mentioned U.S. Ser. No. 15/210,221 is a Continuation of U.S. Ser. No. 14/675,768, filed on Apr. 1, 2015, now U.S. Pat. No. 9,418,221, which is hereby incorporated by reference in its entirety. The above-mentioned U.S. Ser. No. 14/675,768 claims priority and benefit from U.S. 61/973,855, filed on Apr. 2, 2014, which is hereby incorporated by reference in its entirety. The above-mentioned U.S. Ser. No. 14/675,768 is a Continuation-in-Part (CIP) of U.S. Ser. No. 14/566,723, filed on Dec. 11, 2014, now U.S. Pat. No. 9,071,969; which is a Continuation of U.S. Ser. No. 13/922,271, filed on Jun. 20, 2013, now U.S. Pat. No. 8,938,787; which is a Continuation-in-Part (CIP) of U.S. Ser. No. 13/877,676, filed on Apr. 4, 2013, now U.S. Pat. No. 9,069,942; which is a National Stage of PCT International Application number PCT/IL2011/000907, having an International Filing Date of Nov. 29, 2011; which claims priority and benefit from U.S. 61/417,479, filed on Nov. 29, 2010; all of which are hereby incorporated by reference in their entirety. 
This application is also a Continuation-in-Part (CIP) of U.S. Ser. No. 15/368,608, filed on Dec. 4, 2016, which is hereby incorporated by reference in its entirety. The above-mentioned U.S. Ser. No. 15/368,608 is a Continuation-in-Part (CIP) of U.S. Ser. No. 15/001,259, filed on Jan. 20, 2016, now U.S. Pat. No. 9,541,995; which is a Continuation of U.S. Ser. No. 14/320,653, filed on Jul. 1, 2014, now U.S. Pat. No. 9,275,337; all of which are hereby incorporated by reference in their entirety. The above-mentioned U.S. Ser. No. 14/320,653 claims priority and benefit from U.S. 61/843,915, filed on Jul. 9, 2013, which is hereby incorporated by reference in its entirety. The above-mentioned U.S. Ser. No. 14/320,653 is also a Continuation-in-Part (CIP) of U.S. Ser. No. 13/922,271, filed on Jun. 20, 2013, now U.S. Pat. No. 8,938,787, which is hereby incorporated by reference in its entirety. The above-mentioned U.S. Ser. No. 14/320,653 is also a Continuation-in-Part (CIP) of U.S. Ser. No. 13/877,676, filed on Apr. 4, 2013, now U.S. Pat. No. 9,069,942, which was a National Phase of PCT International Application number PCT/IL2011/000907, filed on Nov. 29, 2011, which claimed priority and benefit from U.S. 61/417,479, filed on Nov. 29, 2010. All of the above-mentioned patent applications are hereby incorporated by reference in their entirety. The above-mentioned U.S. Ser. No. 15/368,608 is also a Continuation-in-Part (CIP) of U.S. patent application Ser. No. 14/727,873, filed on Jun. 2, 2015, now U.S. Pat. No. 9,526,006, which is hereby incorporated by reference in its entirety. The above-mentioned U.S. Ser. No. 15/368,608 is also a Continuation-in-Part (CIP) of U.S. Ser. No. 15/360,291, filed on Nov. 23, 2016, now U.S. Pat. No. 9,747,436; which is a Continuation-in-Part (CIP) of U.S. Ser. No. 14/718,096, filed on May 21, 2015, now U.S. Pat. No. 9,531,701; all of which are hereby incorporated by reference in their entirety.

US Referenced Citations (413)
Number Name Date Kind
3618019 Nemirovsky Nov 1971 A
3699517 Dyche Oct 1972 A
3983535 Herbst Sep 1976 A
4128829 Herbst Dec 1978 A
4621334 Garcia Nov 1986 A
4760386 Heath Jul 1988 A
4805222 Young Feb 1989 A
5305238 Starr Apr 1994 A
5442342 Kung Aug 1995 A
5485171 Copper Jan 1996 A
5557686 Brown Sep 1996 A
5565657 Merz Oct 1996 A
5581261 Hickman Dec 1996 A
5838306 O'Connor et al. Nov 1998 A
5874941 Yamada Feb 1999 A
5999162 Takahashi Dec 1999 A
6202023 Hancock Mar 2001 B1
6337686 Wong Jan 2002 B2
6337919 Dunton Jan 2002 B1
6442692 Zilberman Aug 2002 B1
6572014 Lambert Jun 2003 B1
6819219 Bolle Nov 2004 B1
6836554 Bolle Dec 2004 B1
6895514 Kermani May 2005 B1
6931131 Becker Aug 2005 B1
6938061 Rumynin Aug 2005 B1
6938159 O'Connor Aug 2005 B1
6957185 Labaton Oct 2005 B1
6957186 Guheen Oct 2005 B1
6983061 Ikegami Jan 2006 B2
7092926 Cerrato Aug 2006 B2
7130452 Bolle Oct 2006 B2
7133792 Murakami Nov 2006 B2
7139916 Billingsley Nov 2006 B2
7158118 Liberty Jan 2007 B2
7236156 Liberty Jun 2007 B2
7245218 Ikehara Jul 2007 B2
7366919 Sobel Apr 2008 B1
7395436 Nemovicher Jul 2008 B1
7494061 Reinhold Feb 2009 B2
7523191 Thomas Apr 2009 B1
7535456 Liberty May 2009 B2
7606915 Calinov Oct 2009 B1
7796013 Murakami Sep 2010 B2
7818290 Davis Oct 2010 B2
7860870 Sadagopan Dec 2010 B2
8031175 Rigazio Oct 2011 B2
8065624 Morin Nov 2011 B2
8125312 Orr Feb 2012 B2
8156324 Shnowske Apr 2012 B1
8201222 Inoue Jun 2012 B2
8244211 Clark Aug 2012 B2
8285658 Kellas-Dicks Oct 2012 B1
8417960 Takahashi Apr 2013 B2
8433785 Awadallah Apr 2013 B2
8449393 Sobel May 2013 B2
8499245 Froment Jul 2013 B1
8510113 Conkie Aug 2013 B1
8548208 Schultz Oct 2013 B2
8549629 Mccreesh Oct 2013 B1
8555077 Davis Oct 2013 B2
8745729 Poluri Jun 2014 B2
8788838 Fadell Aug 2014 B1
8803797 Scott Aug 2014 B2
8819812 Weber Aug 2014 B1
8832823 Boss Sep 2014 B2
8838060 Walley Sep 2014 B2
8880441 Chen Nov 2014 B1
8938787 Turgeman Jan 2015 B2
8941466 Bayram Jan 2015 B2
8990959 Zhu Mar 2015 B2
9069942 Turgeman Jun 2015 B2
9071969 Turgeman Jun 2015 B2
9154534 Gayles Oct 2015 B1
9174123 Nasiri Nov 2015 B2
9195351 Rosenberg Nov 2015 B1
9275337 Turgeman Mar 2016 B2
9282112 Filatov Mar 2016 B2
9301140 Costigan Mar 2016 B1
9304915 Adams Apr 2016 B2
9418221 Turgeman Aug 2016 B2
9450971 Turgeman Sep 2016 B2
9477826 Turgeman Oct 2016 B2
9483292 Turgeman Nov 2016 B2
9526006 Turgeman Dec 2016 B2
9529987 Deutschmann Dec 2016 B2
9531701 Turgeman Dec 2016 B2
9531733 Turgeman Dec 2016 B2
9536071 Turgeman Jan 2017 B2
9541995 Turgeman Jan 2017 B2
9547766 Turgeman Jan 2017 B2
9552470 Turgeman Jan 2017 B2
9558339 Turgeman Jan 2017 B2
9589120 Samuel Mar 2017 B2
9621567 Turgeman Apr 2017 B2
9626677 Turgeman Apr 2017 B2
9665703 Turgeman May 2017 B2
9674218 Turgeman Jun 2017 B2
9690915 Turgeman Jun 2017 B2
9703953 Turgeman Jul 2017 B2
9712558 Turgeman Jul 2017 B2
9747436 Turgeman Aug 2017 B2
9779423 Turgeman Oct 2017 B2
9838373 Turgeman Dec 2017 B2
9848009 Turgeman Dec 2017 B2
9927883 Lin Mar 2018 B1
10032010 Turgeman Jul 2018 B2
10037421 Turgeman Jul 2018 B2
10049209 Turgeman Aug 2018 B2
10055560 Turgeman Aug 2018 B2
10069837 Turgeman Sep 2018 B2
10069852 Turgeman Sep 2018 B2
10079853 Turgeman Sep 2018 B2
10083439 Turgeman Sep 2018 B2
10164985 Turgeman Dec 2018 B2
10198122 Turgeman Feb 2019 B2
10262324 Turgeman Apr 2019 B2
10298614 Turgeman May 2019 B2
20010004733 Eldering Jun 2001 A1
20020023229 Hangai Feb 2002 A1
20020089412 Heger Jul 2002 A1
20030033526 French Feb 2003 A1
20030074201 Grashey Apr 2003 A1
20030137494 Tulbert Jul 2003 A1
20030149803 Wilson Aug 2003 A1
20030212811 Thornton Nov 2003 A1
20040015714 Abraham Jan 2004 A1
20040017355 Shim Jan 2004 A1
20040021643 Hoshino Feb 2004 A1
20040034784 Fedronic Feb 2004 A1
20040062423 Doi Apr 2004 A1
20040111523 Hall Jun 2004 A1
20040123156 Hammond Jun 2004 A1
20040143737 Teicher Jul 2004 A1
20040186882 Ting Sep 2004 A1
20040221171 Ahmed Nov 2004 A1
20050008148 Jacobson Jan 2005 A1
20050060138 Wang Mar 2005 A1
20050179657 Russo Aug 2005 A1
20050289264 Illowsky Dec 2005 A1
20060006803 Huang Jan 2006 A1
20060080263 Willis Apr 2006 A1
20060090073 Steinberg Apr 2006 A1
20060123101 Buccella Jun 2006 A1
20060143454 Walmsley Jun 2006 A1
20060195328 Abraham Aug 2006 A1
20060215886 Black Sep 2006 A1
20060224898 Ahmed Oct 2006 A1
20060282660 Varghese Dec 2006 A1
20060284969 Kim Dec 2006 A1
20070118804 Raciborski May 2007 A1
20070156443 Gurvey Jul 2007 A1
20070174082 Singh Jul 2007 A1
20070183633 Hoffmann Aug 2007 A1
20070214426 Ruelle Sep 2007 A1
20070236330 Cho Oct 2007 A1
20070240230 O'Connell Oct 2007 A1
20070250920 Lindsay Oct 2007 A1
20070255821 Ge Nov 2007 A1
20070266305 Cong Nov 2007 A1
20070271466 Mak Nov 2007 A1
20070283416 Renaud Dec 2007 A1
20080046982 Parkinson Feb 2008 A1
20080059474 Lim Mar 2008 A1
20080068343 Hoshino Mar 2008 A1
20080084972 Burke Apr 2008 A1
20080091639 Davis Apr 2008 A1
20080092209 Davis Apr 2008 A1
20080092245 Alward Apr 2008 A1
20080097851 Bemmel Apr 2008 A1
20080098456 Alward Apr 2008 A1
20080120717 Shakkarwar May 2008 A1
20080136790 Hio Jun 2008 A1
20080162449 Chao-Yu Jul 2008 A1
20080183745 Cancel Jul 2008 A1
20080192005 Elgoyhen Aug 2008 A1
20080200310 Tagliabue Aug 2008 A1
20080211766 Westerman Sep 2008 A1
20080215576 Zhao Sep 2008 A1
20080263636 Gusler Oct 2008 A1
20080298588 Shakkarwar Dec 2008 A1
20080301808 Calo Dec 2008 A1
20090037983 Chiruvolu Feb 2009 A1
20090038010 Ma Feb 2009 A1
20090089879 Wang Apr 2009 A1
20090094311 Awadallah Apr 2009 A1
20090132395 Lam May 2009 A1
20090157792 Fiatal Jun 2009 A1
20090172551 Kane Jul 2009 A1
20090189736 Hayashi Jul 2009 A1
20090199296 Xie Aug 2009 A1
20090203355 Clark Aug 2009 A1
20090227232 Matas Sep 2009 A1
20090241188 Komura Sep 2009 A1
20090254336 Dumais Oct 2009 A1
20090281979 Tysowski Nov 2009 A1
20090293119 Jonsson Nov 2009 A1
20090320123 Yu Dec 2009 A1
20100007632 Yamazaki Jan 2010 A1
20100040293 Hermann Feb 2010 A1
20100042387 Gibbon Feb 2010 A1
20100042403 Chandrasekar Feb 2010 A1
20100046806 Baughman Feb 2010 A1
20100070405 Joa Mar 2010 A1
20100077470 Kozat Mar 2010 A1
20100082747 Yue Apr 2010 A1
20100082998 Kohavi Apr 2010 A1
20100097324 Anson Apr 2010 A1
20100115610 Tredoux May 2010 A1
20100122082 Deng May 2010 A1
20100125816 Bezos May 2010 A1
20100138370 Wu Jun 2010 A1
20100164897 Morin Jul 2010 A1
20100171753 Kwon Jul 2010 A1
20100197352 Runstedler Aug 2010 A1
20100225443 Bayram Sep 2010 A1
20100245553 Schuler Sep 2010 A1
20100269165 Chen Oct 2010 A1
20100281539 Burns Nov 2010 A1
20100287229 Hauser Nov 2010 A1
20100321304 Rofougaran Dec 2010 A1
20100328074 Johnson Dec 2010 A1
20110010209 McNally Jan 2011 A1
20110012829 Yao Jan 2011 A1
20110016320 Bergsten Jan 2011 A1
20110016534 Jakobsson Jan 2011 A1
20110018828 Wu Jan 2011 A1
20110023115 Wright Jan 2011 A1
20110029902 Bailey Feb 2011 A1
20110039602 McNamara Feb 2011 A1
20110043475 Rigazio Feb 2011 A1
20110050394 Zhang Mar 2011 A1
20110063211 Hoerl Mar 2011 A1
20110065504 Dugan Mar 2011 A1
20110066682 Aldunate Mar 2011 A1
20110102570 Wilf May 2011 A1
20110105859 Popovic May 2011 A1
20110113388 Eisen May 2011 A1
20110154273 Aburada Jun 2011 A1
20110154497 Bailey, Jr. Jun 2011 A1
20110159650 Shiraishi Jun 2011 A1
20110159850 Faith Jun 2011 A1
20110162076 Song Jun 2011 A1
20110191820 Ivey Aug 2011 A1
20110193737 Chiueh Aug 2011 A1
20110202453 Issa Aug 2011 A1
20110221684 Rydenhag Sep 2011 A1
20110223888 Esaki Sep 2011 A1
20110225644 Pullikottil Sep 2011 A1
20110246902 Tsai Oct 2011 A1
20110248941 Abdo Oct 2011 A1
20110251823 Davis Oct 2011 A1
20110271342 Chung Nov 2011 A1
20110276414 Subbarao Nov 2011 A1
20110304531 Brooks Dec 2011 A1
20110320822 Lind Dec 2011 A1
20120005483 Patvarczki Jan 2012 A1
20120005719 McDougal Jan 2012 A1
20120007821 Zaliva Jan 2012 A1
20120054834 King Mar 2012 A1
20120096555 Mahaffey Apr 2012 A1
20120102551 Bidare Apr 2012 A1
20120113061 Ikeda May 2012 A1
20120124662 Baca May 2012 A1
20120133055 Machida May 2012 A1
20120151559 Koudys Jun 2012 A1
20120154173 Chang Jun 2012 A1
20120154273 McDade Jun 2012 A1
20120154823 Sakamoto Jun 2012 A1
20120158503 Mardikar Jun 2012 A1
20120159599 Szoke Jun 2012 A1
20120164978 Conti Jun 2012 A1
20120167170 Shi Jun 2012 A1
20120174213 Geiger Jul 2012 A1
20120188198 Jeong Jul 2012 A1
20120204257 O'Connell Aug 2012 A1
20120218193 Weber Aug 2012 A1
20120246737 Paxton Sep 2012 A1
20120252410 Williams Oct 2012 A1
20120278804 Narayanasamy Nov 2012 A1
20120284380 Anderson Nov 2012 A1
20130024239 Baker Jan 2013 A1
20130036416 Raju Feb 2013 A1
20130076650 Vik Mar 2013 A1
20130088434 Masuda Apr 2013 A1
20130097682 Zeljkovic Apr 2013 A1
20130097706 Titonis Apr 2013 A1
20130111586 Jackson May 2013 A1
20130133055 Ali May 2013 A1
20130135218 Jain May 2013 A1
20130139248 Rhee May 2013 A1
20130154999 Guard Jun 2013 A1
20130162603 Peng Jun 2013 A1
20130167212 Azar Jun 2013 A1
20130212674 Boger Aug 2013 A1
20130237272 Prasad Sep 2013 A1
20130239195 Turgeman Sep 2013 A1
20130239206 Draluk Sep 2013 A1
20130282637 Costigan Oct 2013 A1
20130288647 Turgeman Oct 2013 A1
20130305357 Ayyagari Nov 2013 A1
20130312097 Turnbull Nov 2013 A1
20130335349 Ferren Dec 2013 A1
20140033317 Barber Jan 2014 A1
20140041020 Zhao Feb 2014 A1
20140078061 Simons Mar 2014 A1
20140078193 Barnhoefer Mar 2014 A1
20140082369 Waclawsky Mar 2014 A1
20140111451 Park Apr 2014 A1
20140118520 Slaby May 2014 A1
20140123275 Azar May 2014 A1
20140143304 Hegarty May 2014 A1
20140168093 Lawrence Jun 2014 A1
20140196119 Hill Jul 2014 A1
20140200953 Mun Jul 2014 A1
20140250538 Rapaport Sep 2014 A1
20140259130 Li Sep 2014 A1
20140270571 Dwan Sep 2014 A1
20140283059 Sambamurthy Sep 2014 A1
20140317028 Turgeman Oct 2014 A1
20140317726 Turgeman Oct 2014 A1
20140317734 Valencia Oct 2014 A1
20140317744 Turgeman Oct 2014 A1
20140325223 Turgeman Oct 2014 A1
20140325645 Turgeman Oct 2014 A1
20140325646 Turgeman Oct 2014 A1
20140325682 Turgeman Oct 2014 A1
20140337786 Luo Nov 2014 A1
20140344927 Turgeman Nov 2014 A1
20150002479 Kawamura Jan 2015 A1
20150012920 De Santis Jan 2015 A1
20150062078 Christman Mar 2015 A1
20150091858 Rosenberg Apr 2015 A1
20150094030 Turgeman Apr 2015 A1
20150101031 Harjanto Apr 2015 A1
20150146945 Han May 2015 A1
20150205944 Turgeman Jul 2015 A1
20150205955 Turgeman Jul 2015 A1
20150205957 Turgeman Jul 2015 A1
20150205958 Turgeman Jul 2015 A1
20150212843 Turgeman Jul 2015 A1
20150213244 Lymberopoulos Jul 2015 A1
20150213246 Turgeman Jul 2015 A1
20150213251 Turgeman Jul 2015 A1
20150256528 Turgeman Sep 2015 A1
20150256556 Kaminsky Sep 2015 A1
20150264572 Turgeman Sep 2015 A1
20150268768 Woodhull Sep 2015 A1
20150310196 Turgeman Oct 2015 A1
20160006800 Summers Jan 2016 A1
20160034673 Chandra Feb 2016 A1
20160042164 Goldsmith Feb 2016 A1
20160077620 Choi Mar 2016 A1
20160109969 Keating Apr 2016 A1
20160132105 Turgeman May 2016 A1
20160164905 Pinney Wood Jun 2016 A1
20160164906 Pinney Wood Jun 2016 A1
20160174044 Jones Jun 2016 A1
20160179245 Johansson Jun 2016 A1
20160191237 Roth Jun 2016 A1
20160196414 Stuntebeck Jul 2016 A1
20160197918 Turgeman Jul 2016 A1
20160209948 Tulbert Jul 2016 A1
20160226865 Chen Aug 2016 A1
20160294837 Turgeman Oct 2016 A1
20160300054 Turgeman Oct 2016 A1
20160306974 Turgeman Oct 2016 A1
20160307191 Turgeman Oct 2016 A1
20160307201 Turgeman Oct 2016 A1
20160321445 Turgeman Nov 2016 A1
20160321689 Turgeman Nov 2016 A1
20160342826 Apostolos Nov 2016 A1
20160364138 Luo Dec 2016 A1
20160366177 Turgeman Dec 2016 A1
20160371476 Turgeman Dec 2016 A1
20170011217 Turgeman Jan 2017 A1
20170012988 Turgeman Jan 2017 A1
20170017781 Turgeman Jan 2017 A1
20170032114 Turgeman Feb 2017 A1
20170034210 Talmor Feb 2017 A1
20170048272 Yamamura Feb 2017 A1
20170054702 Turgeman Feb 2017 A1
20170076089 Turgeman Mar 2017 A1
20170085587 Turgeman Mar 2017 A1
20170090418 Tsang Mar 2017 A1
20170091450 Turgeman Mar 2017 A1
20170126735 Turgeman May 2017 A1
20170127197 Mulder May 2017 A1
20170140279 Turgeman May 2017 A1
20170149958 Xian May 2017 A1
20170154366 Turgeman Jun 2017 A1
20170193526 Turgeman Jul 2017 A1
20170195354 Kesin Jul 2017 A1
20170195356 Turgeman Jul 2017 A1
20170221064 Turgeman Aug 2017 A1
20170364919 Ranganath Dec 2017 A1
20180012227 Tunnell Jan 2018 A1
20180034850 Turgeman Feb 2018 A1
20180095596 Turgeman Apr 2018 A1
20180103047 Turgeman Apr 2018 A1
20180107836 Boger Apr 2018 A1
20180115899 Kedem Apr 2018 A1
20180121640 Turgeman May 2018 A1
20180160309 Turgeman Jun 2018 A1
20180314816 Turgeman Nov 2018 A1
20180349583 Turgeman Dec 2018 A1
20180351959 Turgeman Dec 2018 A1
20190028497 Karabchevsky Jan 2019 A1
20190057200 Sabag Feb 2019 A1
20190121956 Turgeman Apr 2019 A1
20190156034 Kedem May 2019 A1
20190158535 Kedem May 2019 A1
20190220863 Novick Jul 2019 A1
Foreign Referenced Citations (9)
Number Date Country
2410450 Jan 2012 EP
2477136 Jul 2012 EP
2610776 Jul 2013 EP
2646904 Aug 2018 EP
3019991 Feb 2019 EP
2338092 May 2010 ES
2005099166 Oct 2005 WO
2007146437 Dec 2007 WO
2012073233 Jun 2012 WO
Non-Patent Literature Citations (38)
Entry
International Search Report for PCT international application PCT/IL2018/051246, dated Mar. 11, 2019.
Written Opinion of the International Searching Authority for PCT international application PCT/IL2018/051246, dated Mar. 11, 2019.
Written Opinion of the International Searching Authority for PCT international application PCT/IL2011/000907, dated Apr. 19, 2012.
Written Opinion of the International Searching Authority for PCT international application PCT/IB2014/062293, dated Oct. 1, 2014.
Written Opinion of the International Searching Authority for PCT international application PCT/IB2014/062941, dated Dec. 17, 2014.
Written Opinion of the International Searching Authority for PCT international application PCT/IB2016/054064, dated Jul. 9, 2015.
Communication from the European Patent Office (EPO) in EP 14814408, dated Oct. 15, 2019.
Faisal Alkhateeb et al., “Bank Web Sites Phishing Detection and Notification System Based on Semantic Web technologies”, International Journal of Security and its Applications 6(4):53-66, Oct. 2012.
Sungzoon Cho et al., “Artificial Rhythms and Cues for Keystroke Dynamics Based Authentication”, International Conference on Biometrics (ICB)—Advances in Biometrics, pp. 626-632, year 2006.
International Search Report for PCT/IB2017/055995, dated Feb. 15, 2018.
Written Opinion of the International Search Authority for PCT/IB2017/055995, dated Feb. 15, 2018.
Supplementary European Search Report for application 11844440 dated Nov. 17, 2017.
International Search Report for application PCT/IB2016/054064 dated Nov. 21, 2016.
International Search Report for application PCT/IB2014/062941 dated Dec. 17, 2014.
International Search Report for application PCT/IB2014/062293 dated Oct. 1, 2014.
International Search Report for application PCT/IL2011/000907 dated Apr. 19, 2012.
Nakkabi et al., “Improving Mouse Dynamics Biometric Performance Using Variance Reduction via Extractors with Separate Features”, Nov. 2010, IEEE Transactions on Systems, Man, and Cybernetics; vol. 40, No. 6.
Nance et al., “Virtual Machine Introspection”, IEEE Security & Privacy, 2008.
Garfinkel and Rosenblum, “A virtual Machine Introspection-Based Architecture for Intrusion Detection.”, 2003, Proc. Network and Distributed Systems Security Symp., The Internet Society, pp. 191-206.
Spafford et al., “Software Forensics: Can We Track Code to its Authors?”, Feb. 1992, Computer Science Technical Report, Purdue e-Pubs, Report No. CSD-TR-92-010.
Tavis Ormandy, “An Empirical Study into the Security Exposure to Hosts of Hostile Virtualized Environments”, retrieved from the Internet on May 3, 2017, from: http://taviso.decsystem.org/virtsec.pdf.
Zhang et al., “An Efficient User Verification System via Mouse Movements”, Oct. 17-21, 2011, CCS' 11, Chicago, Illinois.
Liston et al., “On the Cutting Edge: Thwarting Virtual Machine Detection”; retrieved from the Internet on May 3, 2017, from: http://docplayer.net/9791309-On-the-cutting-edge-thwarting-virtual-machine-detection.html.
Georgia Frantzeskou et al., “Identifying Authorship by Byte-Level N-Grams: The source Code Author Profile (SCAP) Method”, Spring 2007, International Journal of Digital Evidence, vol. 6, issue 1.
Franklin et al., “Remote Detection of Virtual Machine Monitors with Fuzzy benchmarking”, ACM SIGOPS Operating Systems Review, V42, Issue 3, Apr. 2008.
Emmanouil Vasilomanolakis, “A honeypot-driven cyber incident monitor: Lessons learned and steps ahead”; Sep. 2015; SIN '15: Proceedings of the 8th International Conference on Security of Information and Networks; Publisher: ACM; pp. 1-7.
Ahmed et al., “A New Biometric Technology Based on Mouse Dynamics”, Jul.-Sep. 2007, IEEE Transactions on Dependable and Secure Computing, vol. 4, No. 3, pp. 165-179.
Bailey, Kyle O., “Computer Based Behavioral Biometric Authentication Via Multi-Modal Fusion”, Thesis, 2013, Air Force Institute of Technology.
Elizabeth Stinson and John C. Mitchell, “Characterizing the Remote Control Behavior of Bots”, Detection of Intrusions and Malware, and Vulnerability Assessment. Springer Berlin Heidelberg, p. 89-108. Dec. 31, 2007.
Todorov, “Optimality Principles in Sensorimotor Control (Review)”, Sep. 2004, Nature Neuroscience 7, pp. 907-915.
Cleeff et al., “Security Implications of Virtualization: A Literature Study”, Science and Engineering, 2009.
Hibbeln et al., “Detecting Deception in Online Environments: Measuring Fraud Through Mouse Cursor Movements”, Jun. 7, 2014, Gmunden Retreat on NeurolS 2014 Gmunden Austria, p. 38.
Ferrie Peter, “Attack on Virtual Machine Emulators”, Symantec Technology Exchange, 2007.
Yampolskiy et al., “Behavioural Biometrics: a survey and classification”, 2008, International Journal of Biometrics, vol. 1, No. 1, pp. 81-113.
Provos et al., 2007, “The Ghost in the Browser: Analysis of Web-based Malware”.
Huang Yao-Wen et al., “Web application security assessment by fault injection and behavior monitoring”, 2003, Proceedings of the 12th international conference on World Wide Web, ACM.
Ben Hansen, “The Blur Busters Mouse Guide”, dated Feb. 1, 2014; printed from the Internet on Aug. 5, 2019 from: https://www.blurbusters.com/faq/mouse-guide/.
Chris Cain, “Analyzing Man-in-the-Browser (MITB) Attacks”, dated Dec. 2014; downloaded from the Internet on Aug. 5, 2019 from: https://www.sans.org/reading-room/whitepapers/forensics/analyzing-man-in-the-browser-mitb-attacks-35687.
Related Publications (1)
Number Date Country
20190272025 A1 Sep 2019 US
Provisional Applications (4)
Number Date Country
62312140 Mar 2016 US
61843915 Jul 2013 US
61417479 Nov 2010 US
61973855 Apr 2014 US
Continuations (1)
Number Date Country
Parent 14320653 Jul 2014 US
Child 15001259 US
Continuation in Parts (21)
Number Date Country
Parent 15708155 Sep 2017 US
Child 16416222 US
Parent 15422479 Feb 2017 US
Child 15708155 US
Parent 15276803 Sep 2016 US
Child 15422479 US
Parent 14325398 Jul 2014 US
Child 15276803 US
Parent 13922271 Jun 2013 US
Child 14325398 US
Parent 13877676 US
Child 13922271 US
Parent 14320653 Jul 2014 US
Child 14325398 Jul 2014 US
Parent 14320656 Jul 2014 US
Child 14320653 US
Parent 15210221 Jul 2016 US
Child 15422479 Feb 2017 US
Parent 14675768 Apr 2015 US
Child 15210221 US
Parent 14566723 Dec 2014 US
Child 14675768 US
Parent 13922271 US
Child 14566723 US
Parent 13877676 US
Child 13922271 US
Parent 16412222 May 2019 US
Child 13922271 US
Parent 15368608 Dec 2016 US
Child 16412222 US
Parent 15001259 Jan 2016 US
Child 15368608 US
Parent 13922271 US
Child 14320653 US
Parent 13877676 US
Child 13922271 US
Parent 14727873 Jun 2015 US
Child 15368608 Dec 2016 US
Parent 15360291 Nov 2016 US
Child 14727873 US
Parent 14718096 May 2015 US
Child 15360291 US