Continual peer authentication

Information

  • Patent Grant
  • Patent Number
    8,950,001
  • Date Filed
    Tuesday, September 9, 2008
  • Date Issued
    Tuesday, February 3, 2015
Abstract
A method for orchestrating peer authentication during a call (e.g., a telephone call, a conference call between three or more parties, an instant messaging [IM] chat session, etc.) is disclosed. In particular, a user is first authenticated in order to participate in a call (e.g., via entering a password, etc.), and subsequently during the call the user may be peer authenticated. In accordance with the illustrative embodiment, a user who participates in a call might be prompted to authenticate another user on the call based on particular events or user behavior during the call.
Description
CROSS REFERENCE TO RELATED APPLICATIONS

The underlying concepts, but not necessarily the language, of the following case are incorporated by reference:


U.S. patent application Ser. No. 11/832,574, filed 1 Aug. 2007, which is published as U.S. Patent Application Publication No. 2009/0037985. If there are any contradictions or inconsistencies in language between this application and the case that has been incorporated by reference that might affect the interpretation of the claims in this case, the claims in this case should be interpreted to be consistent with the language in this case.


FIELD OF THE INVENTION

The present invention relates to security in general, and, more particularly, to authentication.


BACKGROUND OF THE INVENTION

Peer authentication is a method by which a first user is authenticated by a second user. (Note that, as in the term peer-to-peer communication, the word “peer” is used generically and has no connotation regarding the professional or social standing of the users.) An example of peer authentication is illustrated by the following familiar scenario: an employee in the lobby of a corporate building realizes that she accidentally left her corporate badge at home, and that therefore she will not be allowed to enter the building proper without some other means of authentication. She therefore approaches the guard in the lobby and tells the guard that she is an employee, but that she doesn't have her badge.


The guard then:

    • asks the employee for her name;
    • looks up the name in a computer database;
    • notes the employee's office number;
    • submits a query to the database to determine who the employee's officemate is; and
    • calls the officemate, asking him to come to the lobby to identify the alleged employee.


The officemate arrives at the lobby, and then either:

    • verifies that the alleged employee is indeed an employee of the company, or
    • tells the guard that he does not recognize the alleged employee.


      In the former case, the guard issues the employee temporary identification and permits both employees to advance past the lobby, while in the latter case, the guard stops the alleged employee from advancing past the lobby, and perhaps takes some additional action (e.g., calls the police, etc.).


In a variation of the above scenario, the guard, rather than asking the officemate to come to the lobby, might ask the officemate to talk to the alleged employee over the phone. The officemate talks to the alleged employee over the phone, and determines whether the alleged employee is in fact who she claims to be, based on the telephone conversation (e.g., based on her voice, based on her answers to one or more questions, etc.). The officemate then informs the guard whether the alleged employee should be allowed to advance past the lobby.


SUMMARY OF THE INVENTION

The present invention provides a mechanism for orchestrating peer authentication during an ongoing electronic communication session, which we will term a call (e.g., a telephone call, a conference call between three or more parties, an instant messaging [IM] chat session, etc.). The mechanism is particularly useful in detecting malicious behavior that might occur during a conference call. For example, during an important business conference call (say, concerning the merger of two corporations), one of the participants might have to sneak out of his office momentarily for a bathroom break, during which a malicious user could come into the office and overhear confidential information, or even impersonate the absent participant.


In accordance with the illustrative embodiment of the present invention, a user is first authenticated in order to participate in a call (e.g., via entering a password, etc.), and subsequently during the call the user may be peer authenticated. In particular, a user who participates in a call might be prompted to authenticate another user on the call based on particular events or user behavior during the call. For example, if a first user is silent for a given length of time during the call, a second user on the call (i.e., a “peer”) might be prompted to authenticate the first user (the theory being that, perhaps, another person has maliciously taken the first user's place and is passively listening to the call). As another example, if a first user is participating in a call via a wireless telecommunications terminal, a second user on the call might be prompted to authenticate the first user if the first user has entered a public area in which there is a greater likelihood of malicious behavior.


In accordance with the present invention, a peer might be prompted to authenticate a user in a variety of ways. In some embodiments, for example, a text message might appear on the display of the peer's terminal, asking the peer whether or not a particular user's voice sounds correct. Alternatively, in some other embodiments of the present invention, a text message might prompt the peer to (1) ask a particular user a question that only the actual user would know the answer to, and (2) enter a number between 1 and 5 indicating the peer's confidence in the user's identity.


When there are three or more users participating in a call, then in some embodiments of the present invention the selection of the peer might occur randomly, while in some other embodiments the selection might be based on a variety of criteria (e.g., based on an indication of how well the users know each other, as disclosed in U.S. patent application Ser. No. 11/832,574, which is published as U.S. Patent Application Publication No. 2009/0037985, incorporated by reference, etc.).


The illustrative embodiment comprises: presenting an authentication challenge to a first user who wishes to participate in a call; admitting the first user to the call when the authentication challenge is met; and generating during the call, after the admission of the first user, a signal that prompts a second user participating in the call to authenticate the first user.
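The three steps of the illustrative embodiment can be sketched in Python. This is a hypothetical illustration only; the names `Call`, `admit_if_authenticated`, and `prompt_peer_authentication` are invented here and do not appear in the patent.

```python
# Hypothetical sketch of the illustrative embodiment's three steps:
# present a challenge, admit the user when it is met, and later
# generate a signal prompting a peer to authenticate that user.

class Call:
    def __init__(self):
        self.participants = []   # users admitted to the call
        self.signals = []        # pending peer-authentication prompts

def admit_if_authenticated(call, user, challenge_met):
    """Admit `user` to `call` only when the authentication challenge is met."""
    if not challenge_met:
        return False
    call.participants.append(user)
    return True

def prompt_peer_authentication(call, peer, user):
    """Generate a signal prompting `peer` to authenticate `user` mid-call."""
    call.signals.append({"prompt_to": peer, "authenticate": user})
```

The signal here is modeled as a simple record; in practice it might be a text message or audible prompt delivered to the peer's terminal, as described below.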





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 depicts a flowchart of salient tasks of the method of the present invention, in accordance with the illustrative embodiment of the present invention.



FIG. 2 depicts a detailed flowchart for task 170, as shown in FIG. 1, in accordance with the illustrative embodiment of the present invention.





DETAILED DESCRIPTION

For the purposes of this specification, the term “peer” is defined as a user. In accordance with the illustrative embodiment, two users are considered peers if they participate in a particular call; the term does not impose any constraints on the relative ranks, status, etc. of the users.



FIG. 1 depicts a flowchart of salient tasks of the method of the present invention, in accordance with the illustrative embodiment of the present invention. As will be appreciated by those skilled in the art, in some embodiments of the present invention one or more tasks of FIG. 1 might be performed by a telecommunications terminal, while in some other embodiments, one or more tasks of FIG. 1 might be performed by another telecommunications or data-processing system (e.g., a teleconferencing system, a private branch exchange [PBX], a gateway, a server, etc.), while in still some other embodiments, the tasks of FIG. 1 might be performed by different entities (for example, task 110 performed by the user of a terminal, task 120 performed by the terminal itself, task 130 performed by a gateway, and so forth). In any case, it will be clear to those skilled in the art, after reading this disclosure, how to make and use such embodiments of the present invention. Moreover, it will be clear to those skilled in the art, after reading this disclosure, which tasks depicted in FIG. 1 can be performed simultaneously or in a different order than that depicted.


At task 110, user U attempts to participate in a call via his or her telecommunications terminal T, in well-known fashion.


At task 120, user U is presented with one or more authentication challenges (e.g., a username/password challenge, etc.), in well-known fashion.


At task 130, the method branches based on whether the authentication challenge(s) presented at task 120 were met by user U. If so, execution continues at task 140, otherwise, the method of FIG. 1 terminates.


At task 140, user U is admitted to the call, and the current time and geo-location of terminal T are recorded, in well-known fashion.


Task 150 determines whether user U should be peer authenticated based on a variety of criteria, including:

    • the current time,
    • the time at which user U was admitted to call,
    • the time of the most recent peer authentication of user U (if any),
    • the current geo-location of terminal T,
    • the geo-location history of terminal T during the call,
    • the current geo-locations of other call participants' terminals,
    • one or more rules concerning user U's speech, and
    • one or more rules concerning terminal geo-locations and history.


Some examples of rules concerning user U's speech might include:

    • peer authenticate when user U speaks for the first time
    • peer authenticate when user U has been silent for at least N seconds, where N is a positive number
    • peer authenticate when user U speaks for the first time after being silent for at least M seconds, where M is a positive number
    • peer authenticate when user U speaks for less than a given fraction f of the time over a given time interval
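Two of the speech rules above might be checked as follows. This is a minimal sketch: the function name is invented, and the default values standing in for N and f are placeholders, not values from the patent.

```python
def speech_rule_fires(now, last_spoke_at, spoke_total, call_elapsed,
                      silence_threshold=30.0, min_fraction=0.1):
    """Return True when a speech-based peer-authentication rule fires.

    Implements two of the rules above: user U silent for at least N
    (= silence_threshold) seconds, or user U speaking for less than a
    fraction f (= min_fraction) of the call so far.
    """
    if now - last_spoke_at >= silence_threshold:
        return True  # silent for at least N seconds
    if call_elapsed > 0 and spoke_total / call_elapsed < min_fraction:
        return True  # spoke for less than fraction f of the interval
    return False
```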


Some examples of rules concerning terminal geo-locations and history might include:

    • peer authenticate when terminal T moves to a new geo-location
    • peer authenticate every K minutes when terminal T is in an area that is deemed to be particularly vulnerable to security attacks, where K is a positive number
    • peer authenticate when terminal T has been in three or more different areas in the last L minutes, where L is a positive number
    • peer authenticate when terminal T has moved from being within 5 meters of another call participant to being at least 100 meters away from any other call participant
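Three of the geo-location rules above might be checked as follows. Again a sketch only: the function name and the default re-authentication interval standing in for K are illustrative, not from the patent.

```python
def geo_rule_fires(current_area, previous_area, recent_areas,
                   vulnerable_areas, minutes_since_last_auth,
                   reauth_interval_minutes=10):
    """Return True when a geo-location peer-authentication rule fires.

    Implements three of the rules above: terminal T moved to a new area;
    periodic re-authentication every K (= reauth_interval_minutes)
    minutes in a vulnerable area; and three or more distinct areas
    visited within the recent window.
    """
    if current_area != previous_area:
        return True  # terminal moved to a new geo-location
    if (current_area in vulnerable_areas
            and minutes_since_last_auth >= reauth_interval_minutes):
        return True  # re-authenticate every K minutes in a risky area
    if len(set(recent_areas)) >= 3:
        return True  # three or more different areas in the last L minutes
    return False
```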


At task 160, the method branches based on whether it was determined at task 150 to peer authenticate user U. If so, execution continues at task 170, otherwise, execution proceeds to task 185.


At task 170, peer authentication of user U is performed, as described in detail below and with respect to FIG. 2.


At task 180, the method branches based on whether the verdict received by the peer is deemed to be “satisfactory”. As described below and with respect to task 230 of FIG. 2, in some embodiments of the present invention the peer might provide a simple yes/no verdict, while in some other embodiments, the peer might provide a degree of confidence (e.g., an integer between 1 and 5 inclusive, etc.). As will be appreciated by those skilled in the art, in the latter case, the notion of whether a verdict is judged satisfactory is an implementation-specific issue that might be determined by a systems administrator, a Chief Information Officer of an enterprise, a programmer, etc.


If the verdict is satisfactory, execution continues at task 185, otherwise execution continues at task 195.


At task 185, the current geo-locations of the call participants' terminals are recorded, and information about user U's speech is updated (e.g., how long user U has been silent, the fraction of time during the call that user U speaks, etc.).


Task 190 checks whether the call has terminated or user U has hung up. If either of these events has occurred, the method of FIG. 1 terminates, otherwise execution continues back at task 150.


At task 195, user U is disconnected from the call, and any appropriate action is taken (e.g., alerting security personnel, etc.). After task 195, the method of FIG. 1 terminates.
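Tasks 150 through 195 form a monitoring loop that can be sketched as follows. This is illustrative only; the callback parameters stand in for the rule checks of task 150, the peer-authentication procedure of task 170, and the implementation-specific verdict test of task 180.

```python
def call_monitor_loop(events, needs_peer_auth, do_peer_auth,
                      verdict_is_satisfactory):
    """Sketch of tasks 150-195: on each event, decide whether to peer
    authenticate user U (tasks 150/160); if the verdict is not
    satisfactory (task 180), disconnect the user (task 195)."""
    for event in events:
        if event == "hangup":
            return "call ended"          # task 190: call terminated
        if needs_peer_auth(event):
            verdict = do_peer_auth()     # task 170 (detailed in FIG. 2)
            if not verdict_is_satisfactory(verdict):
                return "disconnected"    # task 195: remove user, alert security
        # task 185: record geo-locations, update speech statistics (omitted)
    return "call ended"
```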



FIG. 2 depicts a detailed flowchart for task 170, in accordance with the illustrative embodiment of the present invention.


At task 210, a peer P is selected from among the call participants for authenticating user U. (Naturally, if there is only one other call participant besides user U, then that person must be peer P.) As will be appreciated by those skilled in the art, there are a variety of possible criteria by which a peer might be selected; several such criteria, along with selection methods based on these criteria, are disclosed in U.S. patent application Ser. No. 11/832,574, which is published as U.S. Patent Application Publication No. 2009/0037985, which is incorporated by reference.


At task 220, peer P is prompted to authenticate user U (e.g., via a text message, via an audible message, etc.). As will be appreciated by those skilled in the art, in some embodiments of the present invention the prompting might specify particular instructions for authenticating user U (e.g., “ask user U a question only he or she would know”, etc.), while in some other embodiments, the prompt might simply ask peer P to indicate whether he thinks user U is who she claims to be.


At task 230, the “verdict” from peer P (e.g., a yes/no answer, a degree of confidence on a numerical scale, etc.) is received. After task 230, execution continues at task 180 of FIG. 1.
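Tasks 210 through 230 might be sketched as follows. This is hypothetical: random selection is one embodiment mentioned in the summary, and `get_verdict` stands in for however the prompt is delivered to peer P and the verdict returned.

```python
import random

def peer_authenticate(participants, user, get_verdict, rng=None):
    """Sketch of tasks 210-230: select a peer P, prompt P to
    authenticate `user`, and return (peer, verdict)."""
    candidates = [p for p in participants if p != user]
    if len(candidates) == 1:
        peer = candidates[0]             # only one possible peer (task 210)
    else:
        peer = (rng or random).choice(candidates)  # random selection variant
    prompt = ("Please authenticate " + user +
              " (e.g., ask a question only he or she would know)")  # task 220
    return peer, get_verdict(peer, prompt)  # task 230: yes/no or 1-5 confidence
```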


It is to be understood that the disclosure teaches just one example of the illustrative embodiment and that many variations of the invention can easily be devised by those skilled in the art after reading this disclosure and that the scope of the present invention is to be determined by the following claims.

Claims
  • 1. A method comprising: presenting by a data-processing system an authentication challenge to a first user who wishes to participate in a call; admitting by said data-processing system said first user to said call when said authentication challenge is met; and generating by said data-processing system during said call, after the admission of said first user, a signal that prompts a second user participating in said call to authenticate said first user, wherein the second user consists of a human.
  • 2. The method of claim 1 wherein the generation of said signal is in response to said first user speaking for the first time during said call.
  • 3. The method of claim 1 wherein the generation of said signal is in response to said first user speaking after being silent for at least N seconds, wherein N is a positive number.
  • 4. The method of claim 1 wherein the generation of said signal is in response to said first user being silent for at least N seconds, wherein N is a positive number.
  • 5. The method of claim 1 wherein the generation of said signal is in response to said first user speaking for less than a fraction f of time over a time interval.
  • 6. The method of claim 1 wherein the generation of said signal is in response to said first user speaking for no more than N seconds contiguously over a time interval, wherein N is a positive number.
  • 7. The method of claim 1 wherein said first user participates in said call via a wireless telecommunications terminal, and wherein the generation of said signal is in response to a change in location of said wireless telecommunications terminal.
  • 8. The method of claim 1 further comprising selecting by said data-processing system said second user from a plurality of users who are participating in said call.
  • 9. A method comprising: presenting by a data-processing system an authentication challenge to a first user who wishes to participate in a call; admitting by said data-processing system said first user to said call at time t1 when said authentication challenge is met; and generating by said data-processing system during said call, at time t2, a signal that prompts a second user participating in said call to authenticate said first user, wherein t2>t1, and wherein the second user consists of a human.
  • 10. The method of claim 9 wherein said time t2 is based on when said first user speaks during said call.
  • 11. The method of claim 9 wherein said first user participates in said call via a wireless telecommunications terminal, and wherein said time t2 is based on at least one location of said wireless telecommunications terminal during time interval [t1, t2].
  • 12. The method of claim 9 wherein said first user participates in said call via a wireless telecommunications terminal, and wherein the magnitude of t2-t1 is based on at least one location of said wireless telecommunications terminal during time interval [t1, t2].
  • 13. The method of claim 9 wherein the magnitude of t2-t1 is based on a metric of said first user's speaking during said call.
  • 14. The method of claim 13 wherein the magnitude of t2-t1 is based on the length of a time interval over which said first user is silent.
  • 15. The method of claim 13 wherein the magnitude of t2-t1 is based on the fraction of time interval [t1, t2] over which said first user is silent.
  • 16. The method of claim 9 further comprising selecting by said data-processing system said second user from a plurality of users who are participating in said call.
  • 17. A method comprising: presenting by a data-processing system an authentication challenge to a first user who wishes to participate in a call; admitting by said data-processing system said first user to said call at time t1 when said authentication challenge is met; generating by said data-processing system during said call, at time t2, a first signal that prompts a second user participating in said call to authenticate said first user, wherein the second user consists of a human; and generating by said data-processing system during said call, at time t3, a second signal that prompts a third user participating in said call to authenticate said first user, wherein t3>t2>t1.
  • 18. The method of claim 17 wherein said first user participates in said call via a wireless telecommunications terminal, and wherein the magnitude of t3-t2 is based on at least one location of said wireless telecommunications terminal during time interval [t2, t3].
  • 19. The method of claim 17 wherein the magnitude of t3-t2 is based on a metric of said first user's speaking during time interval [t2, t3].
  • 20. The method of claim 17 further comprising: selecting by said data-processing system said second user from a plurality of users who are participating in said call at said time t2; and selecting by said data-processing system said third user from a plurality of users who are participating in said call at said time t3.
US Referenced Citations (100)
Number Name Date Kind
5303285 Kerihuel et al. Apr 1994 A
5848156 Murakami Dec 1998 A
6014085 Patel Jan 2000 A
6253202 Gilmour Jun 2001 B1
6282183 Harris et al. Aug 2001 B1
6298072 Koliczew Oct 2001 B1
6349206 Reichelt et al. Feb 2002 B1
6484033 Murray Nov 2002 B2
6788772 Barak et al. Sep 2004 B2
6859651 Gabor Feb 2005 B2
6983278 Yu et al. Jan 2006 B1
7092508 Brown et al. Aug 2006 B2
7139390 Brown et al. Nov 2006 B2
7162256 Seligmann et al. Jan 2007 B2
7181620 Hur Feb 2007 B1
7221949 Clough May 2007 B2
7233997 Leveridge et al. Jun 2007 B1
7237024 Toomey Jun 2007 B2
7246236 Stirbu Jul 2007 B2
7263179 Sammon et al. Aug 2007 B2
7293284 Bartram et al. Nov 2007 B1
7320143 Le Pennec et al. Jan 2008 B2
7334013 Calinov et al. Feb 2008 B1
7392048 Seligmann et al. Jun 2008 B2
7392375 Bartram et al. Jun 2008 B2
7418252 Erskine et al. Aug 2008 B2
7421072 Brotman et al. Sep 2008 B2
7545942 Cohen et al. Jun 2009 B2
7627529 Bauer et al. Dec 2009 B1
7688344 Kimber et al. Mar 2010 B2
7707293 Zhang Apr 2010 B2
7720208 Kia et al. May 2010 B1
7721093 Sundararajan May 2010 B2
7734912 Ganesan et al. Jun 2010 B2
7808906 Rao et al. Oct 2010 B2
8108528 Jones et al. Jan 2012 B2
20010016038 Sammon et al. Aug 2001 A1
20020133716 Harif Sep 2002 A1
20020147019 Uhlik et al. Oct 2002 A1
20030108186 Brown et al. Jun 2003 A1
20030118167 Sammon et al. Jun 2003 A1
20030119506 Singhai et al. Jun 2003 A1
20030156707 Brown et al. Aug 2003 A1
20040035644 Ford et al. Feb 2004 A1
20040054885 Bartram et al. Mar 2004 A1
20040098588 Ohba et al. May 2004 A1
20040122958 Wardrop Jun 2004 A1
20040153518 Seligmann et al. Aug 2004 A1
20040179660 Sammon et al. Sep 2004 A1
20040189441 Stergiou Sep 2004 A1
20040202306 Brotman et al. Oct 2004 A1
20040214558 Chang et al. Oct 2004 A1
20040216039 Lane et al. Oct 2004 A1
20040218744 Nguyen et al. Nov 2004 A1
20040236771 Colver et al. Nov 2004 A1
20050070312 Seligmann et al. Mar 2005 A1
20050091172 King et al. Apr 2005 A1
20050135305 Wentink Jun 2005 A1
20050250482 Seligmann et al. Nov 2005 A1
20060004921 Suess et al. Jan 2006 A1
20060014532 Seligmann et al. Jan 2006 A1
20060030263 Seligmann et al. Feb 2006 A1
20060095771 Appenzeller et al. May 2006 A1
20060178567 Goh et al. Aug 2006 A1
20060224477 Garcia et al. Oct 2006 A1
20060256008 Rosenberg Nov 2006 A1
20070033397 Phillips, II et al. Feb 2007 A1
20070061881 Eyre Mar 2007 A1
20070071180 Kanada Mar 2007 A1
20070094497 O'Gorman et al. Apr 2007 A1
20070112964 Guedalia et al. May 2007 A1
20070118735 Cherrington et al. May 2007 A1
20070162554 Branda et al. Jul 2007 A1
20070171910 Kumar Jul 2007 A1
20070214259 Ahmed et al. Sep 2007 A1
20070230683 Brown et al. Oct 2007 A1
20070265956 Epstein et al. Nov 2007 A1
20070283027 Hoffmann Dec 2007 A1
20070285504 Hesse Dec 2007 A1
20080005095 Horvitz et al. Jan 2008 A1
20080010200 Smith et al. Jan 2008 A1
20080071761 Singh et al. Mar 2008 A1
20080077976 Schulz Mar 2008 A1
20080102766 Schultz May 2008 A1
20080102790 Schultz May 2008 A1
20080120718 Bentley et al. May 2008 A1
20080126541 Rosenberg et al. May 2008 A1
20080141034 Bartram et al. Jun 2008 A1
20080146193 Bentley et al. Jun 2008 A1
20080152113 Chang et al. Jun 2008 A1
20080195861 Salomone Aug 2008 A1
20080235043 Goulandris et al. Sep 2008 A1
20080294655 Picault et al. Nov 2008 A1
20090037985 Bentley et al. Feb 2009 A1
20090125435 Cohen et al. May 2009 A1
20090125721 Numaoka May 2009 A1
20090131015 Bentley et al. May 2009 A1
20090133106 Bentley et al. May 2009 A1
20090133117 Bentley et al. May 2009 A1
20090193514 Adams et al. Jul 2009 A1
Foreign Referenced Citations (3)
Number Date Country
2006092530 Apr 2006 JP
02085019 Oct 2002 WO
2007012831 Feb 2007 WO
Non-Patent Literature Citations (4)
Entry
IBM. “During-conversation Peer Initiated Identity Challenging and Authentication Algorithm for Instant Messaging Systems.” IP Analysis Software. ip.com, Published, Sep. 25, 2006. Accessed Nov. 12, 2014. <http://ip.com/IPCOM/000140902>.
Brainard et al., “Fourth-factor authentication: somebody you know,” Association of Computing Machinery Conference on Computer and Communication Security, Oct. 30-Nov. 3, 2006, pp. 168-178, 11 pages.
Avaya Inc., GB Patent Application No. GB081291.1, Search Report dated Jan. 27, 2009, 3 pages.
Avaya Inc., JP Patent Application No. 2008-194375, Office Action dated Sep. 28, 2012, 2 pages.
Related Publications (1)
Number Date Country
20100064345 A1 Mar 2010 US