The underlying concepts, but not necessarily the language, of the following case are incorporated by reference:
U.S. patent application Ser. No. 11/832,574, filed 1 Aug. 2007, which is published as U.S. Patent Application Publication No. 2009/0037985. If there are any contradictions or inconsistencies in language between this application and the case that has been incorporated by reference that might affect the interpretation of the claims in this case, the claims in this case should be interpreted to be consistent with the language in this case.
The present invention relates to security in general, and, more particularly, to authentication.
Peer authentication is a method by which a first user is authenticated by a second user. (Note that, as in the term peer-to-peer communication, the word “peer” is used generically and has no connotation regarding the professional or social standing of the users.) An example of peer authentication is illustrated by the following familiar scenario: an employee in the lobby of a corporate building realizes that she accidentally left her corporate badge at home, and that therefore she will not be allowed to enter the building proper without some other means of authentication. She therefore approaches the guard in the lobby and tells the guard that she is an employee, but that she doesn't have her badge.
The guard then:
1. contacts one of the alleged employee's officemates; and
2. asks the officemate to come to the lobby to verify the alleged employee's identity.
The officemate arrives at the lobby, and then either:
1. confirms that the alleged employee is in fact who she claims to be, in which case the guard allows her to advance past the lobby; or
2. fails to recognize the alleged employee, in which case the guard denies her entry (and perhaps takes other appropriate action).
In a variation of the above scenario, the guard, rather than asking the officemate to come to the lobby, might ask the officemate to talk to the alleged employee over the phone. The officemate talks to the alleged employee over the phone, and determines whether the alleged employee is in fact who she claims to be, based on the telephone conversation (e.g., based on her voice, based on her answers to one or more questions, etc.). The officemate then informs the guard whether the alleged employee should be allowed to advance past the lobby.
The present invention provides a mechanism for orchestrating peer authentication during an ongoing electronic communication session, which we will term a call (e.g., a telephone call, a conference call between three or more parties, an instant messaging [IM] chat session, etc.). The mechanism is particularly useful in detecting malicious behavior that might occur during a conference call. For example, during an important business conference call—say, concerning the merger of two corporations—one of the participants might have to sneak out of his office momentarily for a bathroom break, during which a malicious user could come into the office and overhear confidential information, or even impersonate the absent participant.
In accordance with the illustrative embodiment of the present invention, a user is first authenticated in order to participate in a call (e.g., via entering a password, etc.), and subsequently during the call the user may be peer authenticated. In particular, a user who participates in a call might be prompted to authenticate another user on the call based on particular events or user behavior during the call. For example, if a first user is silent for a given length of time during the call, a second user on the call (i.e., a “peer”) might be prompted to authenticate the first user (the theory being that, perhaps, another person has maliciously taken the first user's place and is passively listening to the call). As another example, if a first user is participating in a call via a wireless telecommunications terminal, a second user on the call might be prompted to authenticate the first user if the first user has entered a public area in which there is a greater likelihood of malicious behavior.
In accordance with the present invention, a peer might be prompted to authenticate a user in a variety of ways. In some embodiments, for example, a text message might appear on the display of the peer's terminal, asking the peer whether or not a particular user's voice sounds correct. Alternatively, in some other embodiments of the present invention, a text message might prompt the peer to (1) ask a particular user a question that only the actual user would know the answer to, and (2) enter a number between 1 and 5 indicating the peer's confidence in the user's identity.
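By way of illustration, the following Python sketch shows what these two prompt styles might look like. The send_text() helper and the exact message wording are assumptions for illustration only; they are not part of any particular signaling protocol or of the disclosure.

```python
def send_text(terminal_id: str, message: str) -> None:
    # Stand-in for whatever mechanism the call server uses to push a
    # text message to a terminal's display.
    print(f"[to {terminal_id}] {message}")

def prompt_voice_check(peer_terminal: str, user_name: str) -> None:
    # Style 1: a simple yes/no prompt about the user's voice.
    send_text(peer_terminal,
              f"Does {user_name}'s voice sound correct? Reply YES or NO.")

def prompt_challenge_question(peer_terminal: str, user_name: str) -> None:
    # Style 2: ask a question only the real user would know the answer
    # to, then report confidence on a scale of 1 to 5.
    send_text(peer_terminal,
              f"Ask {user_name} a question only he or she would know, "
              f"then enter a number from 1 (low confidence) to 5 (high).")
```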
When there are three or more users participating in a call, in some embodiments of the present invention the selection of the peer might occur randomly, while in some other embodiments the selection might be based on a variety of criteria (e.g., based on an indication of how well the users know each other, as disclosed in U.S. patent application Ser. No. 11/832,574, which is published as U.S. Patent Application Publication No. 2009/0037985, incorporated by reference, etc.).
The illustrative embodiment comprises: presenting an authentication challenge to a first user who wishes to participate in a call; admitting the first user to the call when the authentication challenge is met; and generating during the call, after the admission of the first user, a signal that prompts a second user participating in the call to authenticate the first user.
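By way of illustration, the following Python sketch strings these three steps together. Every helper name in it (present_challenge, should_peer_authenticate, select_peer, prompt_peer, verdict_is_satisfactory, disconnect) is an illustrative assumption, passed in as a parameter so the sketch stays self-contained; possible realizations of the individual helpers are sketched later in this description.

```python
from typing import Callable, Iterable

def run_call_session(
    user: str,
    call_events: Iterable[object],
    present_challenge: Callable[[str], bool],
    should_peer_authenticate: Callable[[str, object], bool],
    select_peer: Callable[[str], str],
    prompt_peer: Callable[[str, str], object],
    verdict_is_satisfactory: Callable[[object], bool],
    disconnect: Callable[[str], None],
) -> bool:
    # Present an authentication challenge before admitting the user.
    if not present_challenge(user):
        return False              # challenge not met: never admitted
    # The user is now admitted; monitor events that occur during the call.
    for event in call_events:
        if should_peer_authenticate(user, event):
            peer = select_peer(user)              # choose a peer
            verdict = prompt_peer(peer, user)     # signal the peer
            if not verdict_is_satisfactory(verdict):
                disconnect(user)                  # remove the user
                return False
    return True
```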
For the purposes of this specification, the term “peer” denotes a fellow participant: two users are considered peers if they participate in the same call. The term does not impose any constraints on the relative ranks, status, etc. of the users.
At task 110, user U attempts to participate in a call via his or her telecommunications terminal T, in well-known fashion.
At task 120, user U is presented with one or more authentication challenges (e.g., a username/password challenge, etc.), in well-known fashion.
At task 130, the method branches based on whether the authentication challenge(s) presented at task 120 were met by user U. If so, execution continues at task 140; otherwise, the method terminates.
At task 140, user U is admitted to the call, and the current time and geo-location of terminal T are recorded, in well-known fashion.
Task 150 determines whether user U should be peer authenticated based on a variety of criteria, including:
1. rules concerning user U's speech during the call; and
2. rules concerning the geo-locations of the participants' terminals and the histories of those geo-locations.
Some examples of rules concerning user U's speech might include:
1. user U has been silent for more than a threshold length of time; and
2. the fraction of time during the call that user U speaks has fallen below a threshold.
Some examples of rules concerning terminal geo-locations and history might include:
1. terminal T has entered a public area in which there is a greater likelihood of malicious behavior; and
2. terminal T's current geo-location is inconsistent with the geo-locations recorded earlier in the call.
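By way of illustration, the following Python sketch shows how rules of both kinds might be combined at task 150. The specific threshold values and the single in_public_area flag are illustrative assumptions; the disclosure does not fix particular numbers or a particular geo-location representation.

```python
from dataclasses import dataclass

@dataclass
class UserCallState:
    silent_seconds: float      # how long user U has been silent
    speaking_fraction: float   # fraction of call time that U speaks
    in_public_area: bool       # whether terminal T is in a public area

MAX_SILENCE_SECONDS = 300.0    # assumed threshold: five minutes
MIN_SPEAKING_FRACTION = 0.05   # assumed threshold: five percent

def should_peer_authenticate(state: UserCallState) -> bool:
    # Any one rule firing is enough to trigger peer authentication.
    return (state.silent_seconds > MAX_SILENCE_SECONDS
            or state.speaking_fraction < MIN_SPEAKING_FRACTION
            or state.in_public_area)
```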
At task 160, the method branches based on whether it was determined at task 150 to peer authenticate user U. If so, execution continues at task 170; otherwise, execution proceeds to task 185.
At task 170, peer authentication of user U is performed, as described in detail below in tasks 210 through 230.
At task 180, the method branches based on whether the verdict received from the peer is deemed to be “satisfactory”. As described below with respect to task 230, the verdict might be, for example, a simple yes/no answer or a degree of confidence on a numerical scale.
If the verdict is satisfactory, execution continues at task 185; otherwise, execution continues at task 195.
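By way of illustration, the following Python sketch shows one way the task 180 test might be implemented, assuming the verdict arrives either as a yes/no answer or as a confidence score on the 1-to-5 scale mentioned earlier; the cutoff of 4 is an illustrative assumption.

```python
from typing import Union

SATISFACTORY_CONFIDENCE = 4   # assumed cutoff on the 1-to-5 scale

def verdict_is_satisfactory(verdict: Union[bool, int]) -> bool:
    if isinstance(verdict, bool):               # yes/no answer
        return verdict
    return verdict >= SATISFACTORY_CONFIDENCE   # numerical confidence
```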
At task 185, the current geo-locations of the call participants' terminals are recorded, and information about user U's speech is updated (e.g., how long user U has been silent, the fraction of time during the call that user U speaks, etc.).
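By way of illustration, the following Python sketch shows the kind of bookkeeping task 185 implies, assuming the server samples each participant's voice activity at a fixed interval; the interval and the external voice-activity test that sets speaking_now are assumptions, not part of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class SpeechStats:
    call_seconds: float = 0.0     # elapsed call time observed so far
    spoken_seconds: float = 0.0   # time user U has spent speaking
    silent_seconds: float = 0.0   # length of U's current silence

def update_speech_stats(stats: SpeechStats,
                        speaking_now: bool,
                        interval_seconds: float = 1.0) -> None:
    stats.call_seconds += interval_seconds
    if speaking_now:
        stats.spoken_seconds += interval_seconds
        stats.silent_seconds = 0.0               # silence run is broken
    else:
        stats.silent_seconds += interval_seconds

def speaking_fraction(stats: SpeechStats) -> float:
    if stats.call_seconds == 0.0:
        return 0.0
    return stats.spoken_seconds / stats.call_seconds
```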
Task 190 checks whether the call has terminated or user U has hung up. If either of these events has occurred, the method terminates; otherwise, execution continues at task 150.
At task 195, user U is disconnected from the call, and any appropriate action is taken (e.g., alerting security personnel, etc.). After task 195, the method terminates.
At task 210, a peer P is selected from among the call participants for authenticating user U. (Naturally, if there is only one other call participant besides user U, then that person must be peer P.) As will be appreciated by those skilled in the art, there are a variety of possible criteria by which a peer might be selected; several such criteria, along with selection methods based on these criteria, are disclosed in U.S. patent application Ser. No. 11/832,574, which is published as U.S. Patent Application Publication No. 2009/0037985 and is incorporated by reference.
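By way of illustration, the following Python sketch shows one possible selection method. The familiarity mapping is an illustrative stand-in for the “how well the users know each other” criteria disclosed in U.S. 2009/0037985; in the simplest embodiment the selection falls back to random choice.

```python
import random
from typing import Dict, List, Optional, Tuple

def select_peer(participants: List[str],
                user: str,
                familiarity: Optional[Dict[Tuple[str, str], float]] = None
                ) -> str:
    candidates = [p for p in participants if p != user]
    if len(candidates) == 1:
        return candidates[0]              # the only possible peer
    if familiarity is None:
        return random.choice(candidates)  # random selection
    # Otherwise, pick the candidate who knows the user best, using
    # scores keyed as (peer, user).
    return max(candidates, key=lambda p: familiarity.get((p, user), 0.0))
```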
At task 220, peer P is prompted to authenticate user U (e.g., via a text message, via an audible message, etc.). As will be appreciated by those skilled in the art, in some embodiments of the present invention the prompting might specify particular instructions for authenticating user U (e.g., “ask user U a question only he or she would know”, etc.), while in some other embodiments, the prompt might simply ask peer P to indicate whether he or she believes that user U is who he or she claims to be.
At task 230, the “verdict” from peer P (e.g., a yes/no answer, a degree of confidence on a numerical scale, etc.) is received. After task 230, execution continues at task 180.
It is to be understood that the disclosure teaches just one example of the illustrative embodiment and that many variations of the invention can easily be devised by those skilled in the art after reading this disclosure and that the scope of the present invention is to be determined by the following claims.
Number | Name | Date | Kind |
---|---|---|---|
5303285 | Kerihuel et al. | Apr 1994 | A |
5848156 | Murakami | Dec 1998 | A |
6014085 | Patel | Jan 2000 | A |
6253202 | Gilmour | Jun 2001 | B1 |
6282183 | Harris et al. | Aug 2001 | B1 |
6298072 | Koliczew | Oct 2001 | B1 |
6349206 | Reichelt et al. | Feb 2002 | B1 |
6484033 | Murray | Nov 2002 | B2 |
6788772 | Barak et al. | Sep 2004 | B2 |
6859651 | Gabor | Feb 2005 | B2 |
6983278 | Yu et al. | Jan 2006 | B1 |
7092508 | Brown et al. | Aug 2006 | B2 |
7139390 | Brown et al. | Nov 2006 | B2 |
7162256 | Seligmann et al. | Jan 2007 | B2 |
7181620 | Hur | Feb 2007 | B1 |
7221949 | Clough | May 2007 | B2 |
7233997 | Leveridge et al. | Jun 2007 | B1 |
7237024 | Toomey | Jun 2007 | B2 |
7246236 | Stirbu | Jul 2007 | B2 |
7263179 | Sammon et al. | Aug 2007 | B2 |
7293284 | Bartram et al. | Nov 2007 | B1 |
7320143 | Le Pennec et al. | Jan 2008 | B2 |
7334013 | Calinov et al. | Feb 2008 | B1 |
7392048 | Seligmann et al. | Jun 2008 | B2 |
7392375 | Bartram et al. | Jun 2008 | B2 |
7418252 | Erskine et al. | Aug 2008 | B2 |
7421072 | Brotman et al. | Sep 2008 | B2 |
7545942 | Cohen et al. | Jun 2009 | B2 |
7627529 | Bauer et al. | Dec 2009 | B1 |
7688344 | Kimber et al. | Mar 2010 | B2 |
7707293 | Zhang | Apr 2010 | B2 |
7720208 | Kia et al. | May 2010 | B1 |
7721093 | Sundararajan | May 2010 | B2 |
7734912 | Ganesan et al. | Jun 2010 | B2 |
7808906 | Rao et al. | Oct 2010 | B2 |
8108528 | Jones et al. | Jan 2012 | B2 |
20010016038 | Sammon et al. | Aug 2001 | A1 |
20020133716 | Harif | Sep 2002 | A1 |
20020147019 | Uhlik et al. | Oct 2002 | A1 |
20030108186 | Brown et al. | Jun 2003 | A1 |
20030118167 | Sammon et al. | Jun 2003 | A1 |
20030119506 | Singhai et al. | Jun 2003 | A1 |
20030156707 | Brown et al. | Aug 2003 | A1 |
20040035644 | Ford et al. | Feb 2004 | A1 |
20040054885 | Bartram et al. | Mar 2004 | A1 |
20040098588 | Ohba et al. | May 2004 | A1 |
20040122958 | Wardrop | Jun 2004 | A1 |
20040153518 | Seligmann et al. | Aug 2004 | A1 |
20040179660 | Sammon et al. | Sep 2004 | A1 |
20040189441 | Stergiou | Sep 2004 | A1 |
20040202306 | Brotman et al. | Oct 2004 | A1 |
20040214558 | Chang et al. | Oct 2004 | A1 |
20040216039 | Lane et al. | Oct 2004 | A1 |
20040218744 | Nguyen et al. | Nov 2004 | A1 |
20040236771 | Colver et al. | Nov 2004 | A1 |
20050070312 | Seligmann et al. | Mar 2005 | A1 |
20050091172 | King et al. | Apr 2005 | A1 |
20050135305 | Wentink | Jun 2005 | A1 |
20050250482 | Seligmann et al. | Nov 2005 | A1 |
20060004921 | Suess et al. | Jan 2006 | A1 |
20060014532 | Seligmann et al. | Jan 2006 | A1 |
20060030263 | Seligmann et al. | Feb 2006 | A1 |
20060095771 | Appenzeller et al. | May 2006 | A1 |
20060178567 | Goh et al. | Aug 2006 | A1 |
20060224477 | Garcia et al. | Oct 2006 | A1 |
20060256008 | Rosenberg | Nov 2006 | A1 |
20070033397 | Phillips, II et al. | Feb 2007 | A1 |
20070061881 | Eyre | Mar 2007 | A1 |
20070071180 | Kanada | Mar 2007 | A1 |
20070094497 | O'Gorman et al. | Apr 2007 | A1 |
20070112964 | Guedalia et al. | May 2007 | A1 |
20070118735 | Cherrington et al. | May 2007 | A1 |
20070162554 | Branda et al. | Jul 2007 | A1 |
20070171910 | Kumar | Jul 2007 | A1 |
20070214259 | Ahmed et al. | Sep 2007 | A1 |
20070230683 | Brown et al. | Oct 2007 | A1 |
20070265956 | Epstein et al. | Nov 2007 | A1 |
20070283027 | Hoffmann | Dec 2007 | A1 |
20070285504 | Hesse | Dec 2007 | A1 |
20080005095 | Horvitz et al. | Jan 2008 | A1 |
20080010200 | Smith et al. | Jan 2008 | A1 |
20080071761 | Singh et al. | Mar 2008 | A1 |
20080077976 | Schulz | Mar 2008 | A1 |
20080102766 | Schultz | May 2008 | A1 |
20080102790 | Schultz | May 2008 | A1 |
20080120718 | Bentley et al. | May 2008 | A1 |
20080126541 | Rosenberg et al. | May 2008 | A1 |
20080141034 | Bartram et al. | Jun 2008 | A1 |
20080146193 | Bentley et al. | Jun 2008 | A1 |
20080152113 | Chang et al. | Jun 2008 | A1 |
20080195861 | Salomone | Aug 2008 | A1 |
20080235043 | Goulandris et al. | Sep 2008 | A1 |
20080294655 | Picault et al. | Nov 2008 | A1 |
20090037985 | Bentley et al. | Feb 2009 | A1 |
20090125435 | Cohen et al. | May 2009 | A1 |
20090125721 | Numaoka | May 2009 | A1 |
20090131015 | Bentley et al. | May 2009 | A1 |
20090133106 | Bentley et al. | May 2009 | A1 |
20090133117 | Bentley et al. | May 2009 | A1 |
20090193514 | Adams et al. | Jul 2009 | A1 |
Number | Date | Country |
---|---|---|
2006092530 | Apr 2006 | JP |
02085019 | Oct 2002 | WO |
2007012831 | Feb 2007 | WO |
Entry |
---|
IBM. “During-conversation Peer Initiated Identity Challenging and Authentication Algorithm for Instant Messaging Systems.” IP Analysis Software. ip.com, Published, Sep. 25, 2006. Accessed Nov. 12, 2014. <http://ip.com/IPCOM/000140902>. |
Brainard et al., “Fourth-factor authentication: somebody you know,” Association for Computing Machinery Conference on Computer and Communications Security, Oct. 30-Nov. 3, 2006, pp. 168-178, 11 pages. |
Avaya Inc., GB Patent Application No. GB081291.1, Search Report dated Jan. 27, 2009, 3 pages. |
Avaya Inc., JP Patent Application No. 2008-194375, Office Action dated Sep. 28, 2012, 2 pages. |
Number | Date | Country | |
---|---|---|---|
20100064345 A1 | Mar 2010 | US |