The present invention relates to computer security in general, and, more particularly, to authentication.
In some instances it is desirable for security reasons to require that the user of a data-processing system (e.g., a wireless telecommunications terminal such as a cellular phone or a smart phone, a personal computer, a server, etc.) be authenticated before the user is permitted to access an application or resource of the data-processing system. Typically a user is presented with an authentication challenge, and the user must supply a valid response to the challenge. Examples of different types of authentication challenges include a username/password challenge and biometric challenges such as fingerprint recognition, voice recognition, a retina scan, and an iris scan.
The present invention enables authentication frequency (i.e., the length of time between authenticating and re-authenticating a user) and challenge type (e.g., username/password, fingerprint recognition, voice recognition, etc.) to be determined based on one or more environmental properties (e.g., ambient noise level, ambient luminosity, temperature, etc.), or one or more physiological properties of a user (e.g., heart rate, blood pressure, etc.), or both. In accordance with the illustrative embodiment, both current and historical environmental and physiological properties can be used in these determinations.
The present invention is advantageous in that it enables authentication frequency to be increased (i.e., less time between re-authentication challenges, which corresponds to tighter security) and the challenge type to be stronger (i.e., more secure) in situations where it is more likely that a malicious user has gained access to a data-processing system. For example, it might be more likely that a user's wireless telecommunications terminal (e.g., a cell phone, a personal digital assistant [PDA], etc.) is left behind or stolen in an environment with a lot of ambient noise (the theory being that the environment is a public place with a lot of people around). As another example, the authentication frequency and challenge type for an office worker's personal computer might be set for a higher level of security when the office is dark. As yet further examples, authentication frequency and challenge type for a wireless telecommunications terminal might be set for a higher level of security when one or more physiological properties of its user differ substantially from their normal prior ranges, or when the physiological properties indicate that the user might be nervous, or when the environment of the terminal at a particular day and time (say, a weekday morning) differs substantially from the norm.
In addition, the present invention enables the selection of an authentication challenge type that is especially well-suited to a particular situation. For example, a voice recognition challenge might be issued when it is dark, as opposed to a retina scan challenge (because it is dark) or a fingerprint recognition challenge (as it might be difficult for the user to find the fingerprint sensor in the dark). As another example, a challenge/response via a video display and keyboard might be more appropriate than a voice recognition challenge in a noisy environment. As yet another example, when physiological properties of a user suggest that the user is engaged in vigorous exercise, a speaker-independent challenge/response via a speaker and microphone might be more suitable than a display/keyboard challenge (as it is likely difficult for the user to type via keyboard while exercising) or a voice recognition challenge (as the error rate might be high when a user is breathing heavily).
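By way of illustration only, the following Python sketch shows one way that challenge-type selection of the kind described above might be expressed. The function name choose_challenge_type, the threshold constants, and the challenge-type strings are hypothetical and are not part of the disclosure; they merely assume simple numeric cut-offs for "dark," "noisy," and "vigorous exercise."

```python
# Hypothetical sketch of challenge-type selection based on environmental and
# physiological properties; all thresholds and names are illustrative only.

DARK_LUX_THRESHOLD = 10.0        # below this, treat the environment as dark
NOISY_DB_THRESHOLD = 70.0        # above this, treat the environment as noisy
EXERCISE_HEART_RATE = 120.0      # above this, assume vigorous exercise

def choose_challenge_type(ambient_lux, ambient_db, heart_rate):
    """Return a challenge type suited to the current situation."""
    if heart_rate >= EXERCISE_HEART_RATE:
        # Typing and voice recognition are both awkward during exercise.
        return "speaker_independent_audio_challenge"
    if ambient_db >= NOISY_DB_THRESHOLD:
        # Voice recognition is unreliable in a noisy environment.
        return "display_keyboard_challenge"
    if ambient_lux <= DARK_LUX_THRESHOLD:
        # Retina scans and fingerprint sensors are hard to use in the dark.
        return "voice_recognition_challenge"
    return "username_password_challenge"

if __name__ == "__main__":
    print(choose_challenge_type(ambient_lux=5.0, ambient_db=45.0, heart_rate=70.0))
    # -> voice_recognition_challenge
```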
The illustrative embodiment comprises: presenting a first authentication challenge at time t1; and presenting a second authentication challenge at time t2; wherein the magnitude of t2-t1 is based on an environmental property at one or more instants in time interval [t1, t2].
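A minimal sketch of how the magnitude of t2-t1 might be derived from environmental readings sampled during the interval [t1, t2] follows; the linear scaling rule, the constants, and the function name next_interval are assumptions made purely for illustration and are not the claimed method.

```python
# Hypothetical sketch: shorten the re-authentication interval as the ambient
# noise observed since the last challenge increases.  Constants are illustrative.

BASE_INTERVAL_SECONDS = 1800.0   # default time between challenges (30 minutes)
MIN_INTERVAL_SECONDS = 120.0     # never re-challenge more often than this
QUIET_DB = 40.0                  # readings at or below this leave the interval unchanged
LOUD_DB = 90.0                   # readings at or above this force the minimum interval

def next_interval(noise_samples_db):
    """Return t2 - t1 in seconds, given noise readings taken during [t1, t2]."""
    if not noise_samples_db:
        return BASE_INTERVAL_SECONDS
    avg_db = sum(noise_samples_db) / len(noise_samples_db)
    # Map the average noise level linearly onto [MIN_INTERVAL, BASE_INTERVAL].
    fraction = (avg_db - QUIET_DB) / (LOUD_DB - QUIET_DB)
    fraction = min(max(fraction, 0.0), 1.0)
    return BASE_INTERVAL_SECONDS - fraction * (BASE_INTERVAL_SECONDS - MIN_INTERVAL_SECONDS)

if __name__ == "__main__":
    print(next_interval([72.0, 68.0, 75.0]))   # noisy environment -> much shorter than 1800
```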
For the purposes of the specification and claims, the term “calendrical time” is defined as indicative of one or more of the following: the time of day, the date, the day of the week, and a span of time (e.g., a weekday morning, etc.).
Transceiver 110 is capable of receiving external signals (e.g., via a wired network, via a wireless network, etc.) and forwarding information encoded in these signals to processor 180, and of receiving information from processor 180 and transmitting signals that encode this information (e.g., via a wired network, via a wireless network, etc.), in well-known fashion.
Memory 120 is capable of storing data, program source code, and executable instructions, as is well-known in the art, and might be any combination of random-access memory (RAM), flash memory, disk drive, etc. In accordance with the illustrative embodiment, memory 120 is capable of storing historical environmental and physiological data.
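One hypothetical way that such historical environmental and physiological data might be organized in memory 120, together with a simple "normal range" check of the kind that could support the determinations described later, is sketched below; the Reading record layout and the two-standard-deviation heuristic are illustrative assumptions only.

```python
# Hypothetical layout for historical environmental/physiological readings and a
# simple "normal range" check; the layout and heuristic are illustrative only.

from dataclasses import dataclass
from statistics import mean, pstdev

@dataclass
class Reading:
    timestamp: float      # seconds since the epoch
    property_name: str    # e.g. "heart_rate", "ambient_noise_db"
    value: float

def normal_range(history, property_name, k=2.0):
    """Return (low, high) bounds derived from prior readings of one property."""
    values = [r.value for r in history if r.property_name == property_name]
    if len(values) < 2:
        return None
    mu, sigma = mean(values), pstdev(values)
    return (mu - k * sigma, mu + k * sigma)

if __name__ == "__main__":
    history = [Reading(t, "heart_rate", v) for t, v in enumerate([62, 66, 64, 70, 65])]
    print(normal_range(history, "heart_rate"))
```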
Clock 130 is capable of transmitting the current time, date, and day of the week to processor 180, in well-known fashion.
Input devices 140-1 through 140-N are capable of receiving input from a user and of forwarding the input to processor 180, in well-known fashion. Examples of input devices 140-1 through 140-N might include a numeric keypad, an alphanumeric keyboard, a fingerprint sensor, a microphone, a magnetic card reader, and so forth.
Output devices 150-1 through 150-M are capable of receiving information, including authentication challenges, from processor 180, and of outputting the information to a user, in well-known fashion. Examples of output devices 150-1 through 150-M might include a video display, a speaker, a vibration mechanism, and so forth.
Environmental sensor array 160 is capable of receiving information concerning environmental properties, as is described in detail below.
Physiological sensor array 170 is capable of receiving information concerning a user's physiological properties, as is described in detail below.
Processor 180 is a general-purpose processor that is capable of reading data from and writing data into memory 120, of executing instructions stored in memory 120, and of executing the tasks described below.
Sound level meter 210 measures ambient sound intensity, in well-known fashion, and transmits its measurements to processor 180.
Photometer 220 measures ambient light intensity, in well-known fashion, and transmits its measurements to processor 180.
Thermometer 230 measures ambient temperature, in well-known fashion, and transmits its measurements to processor 180.
Hygrometer 240 measures ambient humidity, in well-known fashion, and transmits its measurements to processor 180.
Barometer 250 measures ambient air pressure, in well-known fashion, and transmits its measurements to processor 180.
Heart rate monitor 210 measures a user's heart rate, in well-known fashion, and transmits its measurements to processor 180.
Blood pressure monitor 220 measures a user's blood pressure, in well-known fashion, and transmits its measurements to processor 180.
Respiration rate monitor 230 measures a user's respiration rate, in well-known fashion, and transmits its measurements to processor 180.
Body temperature monitor 240 measures a user's body temperature, in well-known fashion, and transmits its measurements to processor 180.
Brain activity monitor 250 is a device such as an electroencephalograph, an electromyograph, etc. that obtains one or more measurements of a user's brain activity and transmits its measurements to processor 180. As will be appreciated by those skilled in the art, in some embodiments of the present invention brain activity monitor 250 might be capable of indicating such conditions as when a user is engaged in deep thought, when a user is engaged in vigorous exercise, when a user is in a stupor, when a user is asleep, and so forth.
As will be appreciated by those skilled in the art, in some embodiments of the present invention physiological sensor array 170 might comprise other kinds of physiological monitors (e.g., an electrocardiograph, a pulse oximeter, etc.) and/or collect other physiological properties (e.g., heart beat, pulse regularity, skin color, etc.) in addition to, or instead of, those described above.
At task 410, environmental properties at data-processing system 100 are obtained from sensor array 160, in well-known fashion.
At task 420, physiological properties of the user of data-processing system 100 are obtained from sensor array 170, in well-known fashion.
At task 430, the input capabilities of data-processing system 100 are determined. As will be appreciated by those skilled in the art, in embodiments of the present invention in which task 430 is performed by data-processing system 100 itself, data-processing system 100 merely has to check which of input devices 140-1 through 140-N are currently enabled and functional. In some other embodiments of the present invention, an authentication server or some other entity might transmit a message to data-processing system 100 that explicitly asks for its input capabilities. In yet other embodiments, an authentication server or some other entity might transmit a message to data-processing system 100 that asks for its manufacturer and model (e.g., Apple iPhone®, etc.) and then consult a database to determine the input capabilities of data-processing system 100 (under the assumption that all of data-processing system 100's capabilities are currently enabled and functional).
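Two of the alternatives described at task 430 might look roughly like the following sketch; the device names, the model-to-capability table, and the helper functions are hypothetical and stand in for whatever protocol messages or database lookups a given embodiment actually uses.

```python
# Hypothetical sketch of task 430: two ways an input-capability list might be obtained.

# (a) Data-processing system 100 checks its own input devices directly.
def local_capabilities(input_devices):
    """input_devices: mapping of device name -> True if enabled and functional."""
    return [name for name, usable in input_devices.items() if usable]

# (b) An authentication server asks only for manufacturer and model and consults
#     a database, assuming all of that model's capabilities are enabled and
#     functional.  (A server could instead ask the system for its capabilities
#     explicitly; that exchange is not shown here.)
MODEL_CAPABILITIES = {
    ("Apple", "iPhone"): ["touchscreen_keyboard", "microphone", "camera"],
}

def capabilities_by_model(manufacturer, model):
    return MODEL_CAPABILITIES.get((manufacturer, model), [])

if __name__ == "__main__":
    print(local_capabilities({"keypad": True, "microphone": True, "card_reader": False}))
    print(capabilities_by_model("Apple", "iPhone"))
```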
At task 440, an authentication challenge type T and a time Δ between challenges are determined based on: the environmental properties obtained at task 410; the physiological properties of the user obtained at task 420; historical environmental and physiological property data; the current calendrical time; and the input capabilities of data-processing system 100 determined at task 430.
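The sketch below illustrates one hypothetical way these inputs might be combined at task 440 into a challenge type T and a time Δ between challenges; the risk-scoring scheme, the weights and thresholds, and the function determine_challenge are illustrative assumptions and do not limit the determination described above.

```python
# Hypothetical sketch of task 440: derive a challenge type T and an inter-challenge
# time DELTA from current readings, deviation from historical norms, calendrical
# time, and the available input devices.  All weights and thresholds are illustrative.

def determine_challenge(noise_db, lux, heart_rate, usual_heart_rate,
                        is_usual_time_of_week, capabilities):
    risk = 0
    if noise_db > 70.0:
        risk += 1                      # crowded or noisy public place
    if lux < 10.0:
        risk += 1                      # dark office or nighttime use
    if abs(heart_rate - usual_heart_rate) > 30.0:
        risk += 1                      # physiology far from its normal prior range
    if not is_usual_time_of_week:
        risk += 1                      # environment differs from the norm for this day/time

    # Higher risk -> less time between challenges (tighter security).
    delta_seconds = max(1800 - 400 * risk, 120)

    # Higher risk -> stronger challenge, constrained by the available input devices.
    if risk >= 3 and "fingerprint_sensor" in capabilities:
        challenge_type = "fingerprint_recognition"
    elif risk >= 2 and "microphone" in capabilities:
        challenge_type = "voice_recognition"
    else:
        challenge_type = "username_password"
    return challenge_type, delta_seconds

if __name__ == "__main__":
    print(determine_challenge(75.0, 5.0, 110.0, 65.0, False,
                              ["microphone", "fingerprint_sensor"]))
```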
As will be appreciated by those skilled in the art, in some embodiments of the present invention, an authentication challenge type might comprise a plurality of successive challenges, rather than a single challenge, thereby enabling even “stronger” authentication challenges. For example, a challenge type determined at task 440 might be “fingerprint recognition, followed by iris scan.”
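Representing such a compound challenge type can be as simple as an ordered sequence of single challenges, as in the hypothetical sketch below; the function present_compound_challenge and the challenge names are illustrative only.

```python
# Hypothetical representation of a "stronger" compound challenge type: an ordered
# sequence of single challenges, all of which must succeed.

def present_compound_challenge(challenge_sequence, present_single):
    """present_single(name) -> True if the user passes that single challenge."""
    return all(present_single(name) for name in challenge_sequence)

if __name__ == "__main__":
    compound = ["fingerprint_recognition", "iris_scan"]
    # Stub that "passes" every challenge, just to exercise the function.
    print(present_compound_challenge(compound, lambda name: True))
```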
As will further be appreciated by those skilled in the art, for embodiments of the present invention in which task 440 is performed by data-processing system 100, the current day and time might be obtained from clock 130, or might be obtained from an external source via transceiver 110. Moreover, although in the illustrative embodiment historical environmental and physiological property data are stored in memory 120, in some other embodiments of the present invention these data might be stored in an external database and accessed by data-processing system 100 via transceiver 110. As will further be appreciated by those skilled in the art, in some embodiments of the present invention the collection, storing, and organization of these historical data might be performed by data-processing system 100 itself, while in some other embodiments of the present invention some other entity might perform these functions.
As will further be appreciated by those skilled in the art, in some embodiments of the present invention in which physiological sensor array 170 is capable of receiving signals from one or more other persons in addition to the current user of data-processing system 100, the physiological properties of these other persons might also be considered in the determination of task 440. Similarly, in some other embodiments of the present invention in which physiological sensor array 170 is capable of receiving signals from one or more other persons instead of the current user of data-processing system 100, the physiological properties of at least one of these other persons will be considered in lieu of physiological properties of the current user of data-processing system 100.
At task 450, an authentication challenge of type T is generated, in well-known fashion.
At task 460, the authentication challenge generated at task 450 is presented to the user of data-processing system 100 at a time in accordance with Δ, in well-known fashion. After task 460, the method of the illustrative embodiment ends.
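Task 460's timing might be realized with a simple scheduler such as the hypothetical sketch below; the function present_at_interval and the single wait-then-challenge structure are assumptions for illustration only.

```python
# Hypothetical sketch of task 460: wait until DELTA seconds have elapsed since the
# previous challenge, then present the newly generated challenge to the user.

import time

def present_at_interval(delta_seconds, last_challenge_time, present_challenge):
    """Sleep until 'delta_seconds' after 'last_challenge_time', then challenge."""
    wait = (last_challenge_time + delta_seconds) - time.time()
    if wait > 0:
        time.sleep(wait)
    present_challenge()
    return time.time()   # becomes 'last_challenge_time' for the next challenge

if __name__ == "__main__":
    t0 = time.time()
    present_at_interval(1.0, t0, lambda: print("Please re-authenticate."))
```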
It is to be understood that the disclosure teaches just one example of the illustrative embodiment and that many variations of the invention can easily be devised by those skilled in the art after reading this disclosure and that the scope of the present invention is to be determined by the following claims.
This application is a continuation-in-part of U.S. patent application Ser. No. 11/942,670, filed 19 Nov. 2007, entitled “Determining Authentication Challenge Timing And Type”, which is incorporated by reference.
U.S. Patent Documents

Number | Name | Date | Kind
---|---|---|---
5768503 | Olkin | Jun 1998 | A |
6014085 | Patel | Jan 2000 | A |
6859651 | Gabor | Feb 2005 | B2 |
7024556 | Hadjinikitas et al. | Apr 2006 | B1 |
7120129 | Ayyagari et al. | Oct 2006 | B2 |
7237024 | Toomey | Jun 2007 | B2 |
7577987 | Mizrah | Aug 2009 | B2 |
7814324 | Blakley et al. | Oct 2010 | B2 |
7860486 | Frank et al. | Dec 2010 | B2 |
8009121 | Stuart et al. | Aug 2011 | B1 |
8027665 | Frank | Sep 2011 | B2 |
8370639 | Azar et al. | Feb 2013 | B2 |
8584200 | Frank | Nov 2013 | B2 |
20020152034 | Kondo et al. | Oct 2002 | A1 |
20020178359 | Baumeister et al. | Nov 2002 | A1 |
20040006710 | Pollutro et al. | Jan 2004 | A1 |
20050015592 | Lin | Jan 2005 | A1 |
20050097320 | Golan et al. | May 2005 | A1 |
20060089125 | Frank | Apr 2006 | A1 |
20070008937 | Mody et al. | Jan 2007 | A1 |
20070250920 | Lindsay | Oct 2007 | A1 |
20080146193 | Bentley et al. | Jun 2008 | A1 |
20080162338 | Samuels et al. | Jul 2008 | A1 |
20080189768 | Callahan et al. | Aug 2008 | A1 |
20090007229 | Stokes | Jan 2009 | A1 |
20090013381 | Torvinen et al. | Jan 2009 | A1 |
20090023422 | MacInnis et al. | Jan 2009 | A1 |
20090183248 | Tuyls et al. | Jul 2009 | A1 |
20090198820 | Golla et al. | Aug 2009 | A1 |
Other Publications
Chetty, G., et al., "Audio-Visual Multimodal Fusion for Biometric Person Authentication and Liveness Verification", Proceedings of the 2005 NICTA-HCSNet Multimodal User Interaction Workshop, 2006, 78 pages.
Ratha et al., "Enhancing Security and Privacy in Biometrics-Based Authentication Systems", IBM Systems Journal, 2001, 21 pages.
Patel, Munjalkumar C., "U.S. Appl. No. 11/942,670 Restriction Requirement Oct. 29, 2010", Publisher: USPTO, Published in: US.
Patel, Munjalkumar C., "U.S. Appl. No. 11/942,670 Office Action Feb. 3, 2011", Publisher: USPTO, Published in: US.
Patel, Munjalkumar C., "U.S. Appl. No. 11/942,670 Office Action May 13, 2011", Publisher: USPTO, Published in: US.
Prior Publication Data

Number | Date | Country
---|---|---
20090133106 A1 | May 2009 | US |
Related U.S. Application Data

Relation | Number | Date | Country
---|---|---|---
Parent | 11942670 | Nov 2007 | US
Child | 12241584 | | US