Touch screens are known to provide a means for displaying graphics and text in electronic devices and for entering commands to control the device or to perform various other functions of the device. Touch screens are becoming increasingly popular as displays in mobile telephones, particularly cellular telephones having integrated personal digital assistant (PDA) or palm-held personal computer (PPC) features. Touch screens are generally designed to operate and respond to a finger touch, or a stylus tap or movement, on the touch screen surface. Touching or tapping a specific point on the touch screen display activates a virtual button, feature, or function shown at that location on the display. Typical telephone features that may be operated by touching or tapping the display include entering a telephone number (for example, by tapping virtual keys of a virtual keyboard shown on the display), making or ending a call, navigating through an address book, increasing or decreasing the listening volume, accessing speed dialing, and locking/unlocking the telephone.
Currently, mobile telephones provide security against unauthorized use by requiring a user to enter a password, such as a text string or a personal identification number (PIN), using a keypad on the mobile telephone. The mobile telephone is locked against use until the user enters the correct password. To maximize security, the mobile telephone may require the user to enter the password every time the mobile telephone is used. In practice, for the sake of convenience and ease of remembering, users often select weak passwords (e.g., predictable passwords, such as a birth date). The more predictable a password is, the easier it is for an unauthorized user to determine it. Conversely, the stronger a password is (e.g., a longer password), the more difficult it is for an authorized user to remember. Existing systems for locking computing devices thus fail to encourage users to select strong passwords while also providing an engaging unlock experience.
Embodiments of the disclosure present an image to a user on a computing device. The image serves as a template for locking and unlocking the computing device and includes a plurality of portions that are defined, and thereafter, identified to the user as selectable portions. A user's selection of the portions of the image is received and stored as an unlock code for subsequently unlocking the computing device.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
Corresponding reference characters indicate corresponding parts throughout the drawings.
Referring to the figures, embodiments of the disclosure described herein relate to lock and unlock capabilities of a computing device. For example, aspects of the disclosure include a touch screen 304 on a mobile computing device 102. In embodiments of the disclosure, functions of mobile computing device 102 are selectable by tapping or touching the touch screen 304 with, for example, a stylus or a user's finger. Embodiments of the disclosure use an image, such as a photograph, as the basis for a “key combination,” in contrast to conventional systems and methods that utilize numeric or alphanumeric personal identification numbers (PINs) (e.g., four or six digit PINs). For example, a user selects portions or zones of an image in a pre-defined sequence, and this sequence is stored as an unlock code. Thus, the unlock function on the mobile computing device 102 may be personalized, thereby enhancing the user unlock experience while maintaining or increasing the security of the mobile computing device 102. In some embodiments, it is more difficult for an unauthorized user to gain access to a mobile computing device 102 locked in accordance with the disclosure than to one locked in accordance with conventional lock functionality such as, for example, a 10-digit PIN.
The user may select an image to serve as a template for unlocking the mobile computing device 102. After the user selects an image to be displayed on the mobile computing device 102, the user selects portions or zones of the image in a specified sequence. The portions or zones, along with the specified sequence, are stored as an unlock code. Thus, when the user wants to unlock the mobile computing device 102, the selected image is displayed on the mobile computing device 102. The user selects the portions or zones in the image (e.g., contacts the portions or zones via the touch screen 304) in the selected sequence to unlock the mobile computing device 102. As described herein, utilizing the image as a reference for the unlock code not only provides a user with an interactive, fun, and personalized method of locking and/or unlocking the mobile computing device 102, but also provides security based at least on the quantity of portions or zones within an image that may be selected. In some embodiments, the portions may be defined by identified contours or edges of objects within the image. In other embodiments, the touch screen 304 may have a predefined grid allowing selection of portions of the image within the cells of the grid.
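A minimal sketch of this enrollment flow follows. It is illustrative rather than part of the disclosure: the zone names, the rectangular zone coordinates, and the `zone_at` helper are assumptions used to show how touch coordinates can be reduced to a zone sequence that is stored as the unlock code.

```python
# Illustrative sketch only: zone names, coordinates, and helper are assumed.
ZONES = {
    # zone name: (x_min, y_min, x_max, y_max) in image pixels
    "dog":   (40, 300, 160, 420),
    "ball":  (200, 500, 260, 560),
    "house": (320, 80, 600, 360),
}

def zone_at(x, y):
    """Map a touch coordinate to the selectable zone it falls in, if any."""
    for name, (x0, y0, x1, y1) in ZONES.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

# Enrollment: the user taps the dog, then the ball, then the house.
taps = [(100, 350), (230, 530), (450, 200)]
unlock_code = [zone_at(x, y) for (x, y) in taps]
assert unlock_code == ["dog", "ball", "house"]  # stored as the unlock code
```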
In addition, unlike conventional systems and methods, the user may lock the mobile computing device 102 while an application is in progress without disrupting execution of the running application. For example, if the user is listening to music on the mobile computing device 102, the user may lock the mobile computing device 102 while continuing to listen to the music. This allows the user to, for example, listen to uninterrupted music while exercising, because locking the device prevents unintentional touches from initiating other applications or disrupting playback. In a further example, a user may initiate a global positioning system (GPS) application to provide the user with directions to a particular location. In this example, the user may lock the mobile computing device 102 while the GPS application is providing directions. Thus, unintentional touching of the device cannot accidentally initiate other applications while the directions are being provided by the GPS application.
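The following sketch illustrates the idea that locking gates only input dispatch while running applications continue; the `Device` class and its method names are assumptions for illustration, not the disclosed implementation.

```python
# Illustrative sketch only: class and method names are assumed.
class Device:
    def __init__(self):
        self.locked = False
        self.music_playing = False

    def play_music(self):
        self.music_playing = True

    def lock(self):
        # Only input routing changes; background playback is untouched.
        self.locked = True

    def handle_touch(self, event):
        if self.locked:
            return "show_lock_screen"  # touches cannot launch other apps
        return f"dispatch:{event}"

device = Device()
device.play_music()
device.lock()
assert device.music_playing                        # playback uninterrupted
assert device.handle_touch("tap") == "show_lock_screen"
```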
While some embodiments of the disclosure are illustrated and described herein with reference to mobile computing device 102 such as a mobile telephone, aspects of the disclosure are operable with any device that performs the functionality illustrated and described herein, or its equivalent. For example, embodiments of the disclosure are operable with netbooks, desktop computing devices, laptop computers, gaming consoles, portable audio players, and other devices.
Aspects of the disclosure described herein provide, at least, an intuitive, enjoyable, and memorable lock functionality for any computing device. In embodiments, the lock screen appears in response to an explicit user command to lock the device or upon powering on the device.
Referring again to FIG. 1, an exemplary block diagram illustrates the mobile computing device 102 having a memory area 104, a display 106, and a processor 108.
Memory area 104, or other computer-readable medium or media, stores, for example, user-defined or default images and unlock codes. While images and unlock codes are described as being stored in the memory area 104 on mobile computing device 102, one or more of the images and unlock codes may instead be stored remotely from mobile computing device 102. For example, images and unlock codes may be stored in a cloud service, a database, or other memory area accessible by mobile computing device 102.
Memory area 104 further stores one or more computer-executable components. Exemplary components include, but are not limited to, a presentation component 110, a detection component 114, an authorization component 116, and a lock component 118. While the components are shown to be stored in memory area 104, the components may be stored and executed from a memory area remote from mobile computing device 102. For example, the components may be stored by a cloud service, and the output of the execution of the components may be provided to mobile computing device 102. Such embodiments reduce the computational and storage burden on mobile computing device 102. The components illustrated in FIG. 1 are described in more detail below.
Processor 108 executes computer-executable instructions for implementing aspects of the disclosure. In some embodiments, the processor 108 is transformed into a special purpose microprocessor by executing computer-executable instructions or by otherwise being programmed. For example, the processor 108 is programmed with instructions such as those illustrated in the flow charts of FIG. 2 and FIG. 5.
Referring next to FIG. 2, an exemplary flow chart illustrates the creation of an unlock code based on an image. An image is selected to serve as the lock screen template, and a plurality of portions of the image are defined as selectable.
At 206, the plurality of portions are identified or presented to a user via the display 106. In an embodiment, the defined portions may be highlighted, for example, via color, shading, or circling, to distinguish the defined portions of the image from the non-defined portions. In some embodiments, the defined portions of an image are the only selectable areas of the image. In other embodiments, the defined portions of an image are merely recommended selectable areas of the image.
At 208, a selection of the portions is received from the user via display 106. For example, a user selects portions/zones within the image in a specified sequence. The user may use more than one image as well as a movement of the mobile computing device 102 as part of the unlock code. At 210, the received selections are stored in the memory area 104 as the unlock code. The unlock code may be subsequently modified by the user. Further, other unlock codes may be created that represent additional or alternative accepted codes for unlocking the device. For example, there may be different levels of unlock functionality. In some embodiments, one unlock code may expose a portion of the functionality of the mobile computing device 102, while another unlock code exposes the full functionality of the mobile computing device 102. In a particular example, one unlock code may enable a user to make calls or send instant messages, but prevent access to corporate emails. Another unlock code enables access to corporate email in addition to the rest of the functionality of the device.
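One way to picture the multiple-level unlock codes is sketched below; the code strings and capability names are hypothetical stand-ins for stored portion-sequence codes.

```python
# Illustrative sketch only: code strings stand in for stored portion sequences.
UNLOCK_LEVELS = {
    "code-basic": {"calls", "instant_messaging"},
    "code-full":  {"calls", "instant_messaging", "corporate_email"},
}

def capabilities_for(entered_code):
    """Return the set of features exposed by the entered unlock code."""
    return UNLOCK_LEVELS.get(entered_code, set())

assert "corporate_email" not in capabilities_for("code-basic")
assert "corporate_email" in capabilities_for("code-full")
```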
Referring next to FIG. 3 and FIG. 4, exemplary user interfaces illustrate images displayed on the touch screen 304 of the mobile computing device 102 for locking and unlocking the device.
Referring next to FIG. 5, an exemplary flow chart illustrates the unlocking of the mobile computing device 102 using the stored unlock code. At 502, the presentation component 110 presents the selected image to the user on the display 106.
Detection component 114 detects the user input at 504 and receives the portion selections and sequence associated therewith or other operations from the user as input. The portion selections and sequence represent an attempt by the user to unlock the mobile computing device 102. In some embodiments, detection component 114 receives data from one or more interfaces or modules that receive the user input from the user. The interfaces may include, for example, touch screen 304 or other display generally such as display 106, or one or more accelerometers (e.g., one accelerometer for each axis of motion). For example, in some embodiments, the detection component 114 receives data from the accelerometers that corresponds to user-induced movement of the mobile computing device 102. The user-induced movement may represent a sequence of movements along at least one axis of rotation of the mobile computing device 102. The movements include, for example, rotating the mobile computing device 102 a number of degrees, shaking the mobile computing device 102 for a predetermined amount of time, or performing a sequence of movements of the mobile computing device 102. For compass-equipped mobile computing devices, the user may point the mobile computing device in a particular direction (e.g., north) or sequence of directions. In some embodiments, the movement or sequence of movements may correspond to part of the unlock code or constitute the entire unlock code.
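A sketch of how touch selections and movement gestures might be merged into a single ordered input sequence follows; the event encodings and class shape are assumptions.

```python
# Illustrative sketch only: event encodings and class shape are assumed.
class DetectionComponent:
    def __init__(self):
        self.events = []

    def on_touch(self, zone):
        self.events.append(f"tap:{zone}")

    def on_motion(self, gesture):
        # Gesture derived from accelerometer (or compass) data.
        self.events.append(f"motion:{gesture}")

    def sequence(self):
        return list(self.events)

detector = DetectionComponent()
detector.on_touch("dog")
detector.on_motion("rotate_90")   # rotate the device 90 degrees
detector.on_touch("house")
# The combined sequence is what gets compared against the unlock code.
print(detector.sequence())  # ['tap:dog', 'motion:rotate_90', 'tap:house']
```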
In further embodiments, a movement or sequence of movements of the mobile computing device 102 may be combined with, or correspond to, a selection of a portion of a presented image. For example, an image of a combination lock having one or more reels is displayed to the user on the display 106. To unlock the mobile computing device 102, the user selects one of the reels (e.g., touches the display 106) and flicks or otherwise moves the mobile computing device 102 to rotate through the digits on the selected reel of the combination lock. For example, when the mobile computing device 102 is rotated away from the user (e.g., by 180 degrees), the selected reel is similarly rotated (e.g., by 180 degrees, or more or less) to simulate manual rotation of the combination lock. The user performs similar operations to enter the combination that unlocks the lock.
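A sketch of mapping device rotation onto a selected reel follows; the 10-digit reel and the 36-degrees-per-digit step are assumptions.

```python
# Illustrative sketch only: a 10-digit reel stepped 36 degrees per digit.
class Reel:
    DIGITS = 10
    DEGREES_PER_DIGIT = 360 / DIGITS  # 36 degrees advances one digit

    def __init__(self):
        self.position = 0

    def rotate(self, degrees):
        steps = round(degrees / self.DEGREES_PER_DIGIT)
        self.position = (self.position + steps) % self.DIGITS

reel = Reel()
reel.rotate(180)           # device rotated 180 degrees away from the user
assert reel.position == 5  # the reel advances five digits
```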
In some embodiments, the user selects and moves a portion of the lock screen image to a different location within the image. For example, the image may include a plurality of scrambled pieces that the user unscrambles, or sorts, to unlock the mobile computing device 102. In such embodiments, not only does the user select particular portions of the image, but the user also places the portions in a particular location.
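The scrambled-pieces variant can be pictured as below; the piece identifiers and slot numbering are assumptions.

```python
# Illustrative sketch only: piece ids and slot numbering are assumed.
def is_unlocked(piece_positions, solution):
    """Both arguments map piece id -> grid slot index."""
    return piece_positions == solution

solution  = {"piece_a": 0, "piece_b": 1, "piece_c": 2}
positions = {"piece_a": 2, "piece_b": 0, "piece_c": 1}  # scrambled at lock time
assert not is_unlocked(positions, solution)
positions.update({"piece_a": 0, "piece_b": 1, "piece_c": 2})  # user sorts pieces
assert is_unlocked(positions, solution)
```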
Authorization component 116 compares the data or operations received by the detection component 114 to a predefined sequence of the portions of the image at 506 to determine whether the user input corresponds to the unlock code. If the received operations match the stored unlock code, the lock component 118 unlocks the mobile computing device 102 at 508. Otherwise, the lock component 118 displays an error message to the user indicating that the input did not match the stored unlock code.
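The comparison step might look like the sketch below, which hashes the attempted sequence and compares digests with a timing-safe check; the hashing scheme is an assumption, not the disclosed method.

```python
# Illustrative sketch only: the hashing scheme is assumed.
import hashlib
import hmac

def digest(sequence):
    return hashlib.sha256("|".join(sequence).encode("utf-8")).digest()

stored = digest(["dog", "ball", "house"])  # saved at enrollment

def authorize(attempt):
    # Timing-safe comparison against the stored unlock code.
    return hmac.compare_digest(digest(attempt), stored)

assert authorize(["dog", "ball", "house"])
assert not authorize(["dog", "house", "ball"])  # wrong order -> error message
```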
Referring next to FIG. 6, an exemplary image illustrates selectable portions or zones of a lock screen image. In some embodiments, the portions or zones are defined by a predefined grid overlaid on the image.
In other embodiments, the portions or zones are defined by other areas of the image. For example, the selectable portions may correspond to windows in an image of a building, or faces in a family portrait. Aspects of the disclosure contemplate numerous other means for defining the selectable portions based on the image.
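For the predefined-grid embodiment, each touch can be reduced to a grid cell index, as in the sketch below; the grid dimensions and screen size are assumptions.

```python
# Illustrative sketch only: grid dimensions and screen size are assumed.
GRID_COLS, GRID_ROWS = 4, 6      # 4x6 selection grid
SCREEN_W, SCREEN_H = 480, 800    # screen size in pixels

def cell_for(x, y):
    """Reduce a touch coordinate to a single grid-cell index."""
    col = min(int(x / (SCREEN_W / GRID_COLS)), GRID_COLS - 1)
    row = min(int(y / (SCREEN_H / GRID_ROWS)), GRID_ROWS - 1)
    return row * GRID_COLS + col

# Touches in opposite corners map to the first and last cells.
print(cell_for(30, 50), cell_for(460, 790))  # 0 23
```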
Additional Examples
Aspects of the disclosure may also be used as a human interactive proof (HIP) or other challenge meant to be easily solved by a human yet difficult for a program, such as a bot, to solve. In such embodiments, for example, an image is displayed on a computing device during creation of a user account. Instructions are displayed such as “Select the following portions of the image with the mouse or stylus in sequence: 1. The red ball, 2. The brown dog, 3. The boat, and 4. The red shutters on the house.” The portions and/or sequence in the instructions may be chosen randomly by the computing device, and the instructions may be displayed in multiple languages to internationalize the challenge. If the user selects the portions of the image in the requested sequence, the challenge is satisfied and the user may continue to create the user account. If the user fails the challenge, the user is prevented from creating the user account. Challenges such as these are applicable in multiple countries, in contrast to existing HIP challenges that, for example, visually distort an English word and request textual input of the word.
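A sketch of the HIP variant follows: choose a random sequence of labeled portions, instruct the user, and verify the selections. The portion labels and the challenge length are assumptions.

```python
# Illustrative sketch only: portion labels and challenge length are assumed.
import random

PORTIONS = ["red ball", "brown dog", "boat", "red shutters"]

def make_challenge(k=4):
    # Random portion order chosen per account-creation attempt.
    return random.sample(PORTIONS, k)

def passes(challenge, clicked_sequence):
    return clicked_sequence == challenge

challenge = make_challenge()
print("Select the portions in this sequence:", challenge)
assert passes(challenge, list(challenge))                # correct sequence
assert not passes(challenge, list(reversed(challenge)))  # wrong order fails
```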
Exemplary Operating Environment
A computer or computing device such as described herein has one or more processors or processing units, system memory, and some form of computer readable media. By way of example and not limitation, computer readable media comprise computer storage media and communication media. Computer storage media include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Communication media typically embody computer readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave or other transport mechanism and include any information delivery media. Combinations of any of the above are also included within the scope of computer readable media.
The computer may operate in a networked environment using logical connections to one or more remote computers. Although described in connection with an exemplary computing system environment, embodiments of the disclosure are operational with numerous other general purpose or special purpose computing system environments or configurations. The computing system environment is not intended to suggest any limitation as to the scope of use or functionality of any aspect of the disclosure. Moreover, the computing system environment should not be interpreted as having any dependency or requirement relating to any one or combination of components illustrated in the exemplary operating environment. Examples of well-known computing systems, environments, and/or configurations that may be suitable for use with aspects of the disclosure include, but are not limited to, personal computers, server computers, hand-held or laptop devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, mobile telephones, network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like.
Embodiments of the disclosure may be described in the general context of computer-executable instructions, such as program modules, executed by one or more computers or other devices. The computer-executable instructions may be organized into one or more computer-executable components or modules. Generally, program modules include, but are not limited to, routines, programs, objects, components, and data structures that perform particular tasks or implement particular abstract data types. Aspects of the disclosure may be implemented with any number and organization of such components or modules. For example, aspects of the disclosure are not limited to the specific computer-executable instructions or the specific components or modules illustrated in the figures and described herein. Other embodiments of the disclosure may include different computer-executable instructions or components having more or less functionality than illustrated and described herein. Aspects of the disclosure may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
The order of execution or performance of the operations in embodiments of the disclosure illustrated and described herein is not essential, unless otherwise specified. That is, the operations may be performed in any order, unless otherwise specified, and embodiments of the disclosure may include additional or fewer operations than those disclosed herein. For example, it is contemplated that executing or performing a particular operation before, contemporaneously with, or after another operation is within the scope of aspects of the disclosure.
When introducing elements of aspects of the disclosure or the embodiments thereof, the articles “a,” “an,” “the,” and “said” are intended to mean that there are one or more of the elements. The terms “comprising,” “including,” and “having” are intended to be inclusive and mean that there may be additional elements other than the listed elements.
Having described aspects of the disclosure in detail, it will be apparent that modifications and variations are possible without departing from the scope of aspects of the disclosure as defined in the appended claims. As various changes could be made in the above constructions, products, and methods without departing from the scope of aspects of the disclosure, it is intended that all matter contained in the above description and shown in the accompanying drawings shall be interpreted as illustrative and not in a limiting sense.
This application is a continuation of and claims priority to U.S. patent application Ser. No. 13/890,019, filed May 8, 2013, which is a divisional of and claims priority to U.S. patent application Ser. No. 12/485,952, filed Jun. 17, 2009, the disclosures of which are incorporated by reference herein in their entirety.