The present disclosure relates generally to user authentication techniques on electronic devices.
Although the potential advantages of using biometric authentication over traditional personal identification number (“PIN”) authentication have long been understood, its use in consumer electronic devices has only recently become popular. With biometric authentication, a user does not need to enter a PIN and, under the right conditions, does not even need to touch the device in order to unlock it.
Most existing biometric authentication schemes use the same basic access logic as traditional PIN-based systems: a user either is authenticated or is not, and accordingly gains either full access or no access. Furthermore, such schemes generally do not adjust in real time to dynamic conditions such as the movement and position of the user.
While the appended claims set forth the features of the present techniques with particularity, these techniques, together with their objects and advantages, may be best understood from the following detailed description taken in conjunction with the accompanying drawings of which:
According to various embodiments of the disclosure, an electronic device (also referred to as “the device”) is able to alter one or more settings of its imager (e.g., its camera) based on the motion of a user that the device is attempting to authenticate. In an embodiment, the device captures a first set of image data of the user (e.g., a moving video or still image of the user), alters a setting of the imager based on the motion, captures a second set of image data of the user, and authenticates the user based on the second set of image data.
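By way of illustration only, the following sketch shows one way this capture, alter, re-capture, authenticate sequence could be arranged; the `Imager` and `Motion` types and the caller-supplied `match_confidence` function are hypothetical stand-ins, since the disclosure does not prescribe any particular implementation.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Motion:
    speed: float  # user's speed, e.g., in feet per second

@dataclass
class Imager:
    frame_rate: int = 20  # frames per second

    def capture(self) -> List[bytes]:
        # Stand-in for real frame capture: one placeholder per frame.
        return [b"frame"] * self.frame_rate

def authenticate(imager: Imager, motion: Motion,
                 match_confidence: Callable[[List[bytes]], float]) -> float:
    first = imager.capture()          # capture a first set of image data
    if motion.speed > 0:              # alter a setting based on the motion
        imager.frame_rate *= 2
    second = imager.capture()         # capture a second set of image data
    return match_confidence(second)   # authenticate on the second set
```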
According to an embodiment of the disclosure, the device grants the user a first level of access to the device based on the first set of image data and grants the user a second level of access to the device based on the second set of image data. The number of possible access levels is not limited, and the example of two levels discussed herein is only meant to be illustrative. Additionally, the electronic device may capture the two sets of image data with two different imagers, stitch the sets of image data together, and carry out authentication on the stitched sets of image data. The number of imagers that may be used is not limited to two, however.
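A minimal sketch of the stitching step follows, assuming each set of image data is a NumPy array and letting simple side-by-side composition stand in for true image stitching (feature matching and blending are beyond the disclosure's level of detail):

```python
import numpy as np

def stitch(first: np.ndarray, second: np.ndarray) -> np.ndarray:
    """Compose two views into one wider view on which to authenticate."""
    height = min(first.shape[0], second.shape[0])
    return np.hstack((first[:height], second[:height]))

# Example: two 480x640 grayscale captures become one 480x1280 composite.
composite = stitch(np.zeros((480, 640)), np.zeros((480, 640)))
```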
Turning to FIG. 1, an electronic device 100 includes a housing 102 having a front side 104 and a rear side 106.
Set within the front side 104 of the housing 102 are a display 108 (e.g., an organic light-emitting diode display) and a fifth imager 110E (e.g., a front-facing camera). Set within the rear side 106 of the housing 102 is a sixth imager 110F (e.g., a rear-facing camera). Although depicted in FIG. 1 with only the fifth imager 110E and the sixth imager 110F visible, the device 100 also includes a first imager 110A, a second imager 110B, a third imager 110C, and a fourth imager 110D set within other portions of the housing 102, and the number of imagers is not limited to six.
Turning to FIG. 2, the device 100 includes a processor 202 (e.g., an application processor) communicatively linked with the imagers 110A through 110F, the display 108, an audio output 206 (e.g., a loudspeaker), a first motion sensor 116A, and a second motion sensor 116B.
Turning to FIG. 3 and FIG. 4, in one scenario the user 302 approaches the device 100 and moves within range of the first motion sensor 116A. The first motion sensor 116A detects the motion of the user 302, in response to which the processor 202 turns on the first imager 110A and controls the first imager 110A to capture a first set of image data of the user 302 (block 402). The processor 202 attempts to authenticate the user 302 based on the first set of image data.
In this scenario, the processor 202 determines, with at least a 50% confidence level (based on its authentication attempt with the first set of image data), that the user 302 is an authorized user. Based on this determination, the processor 202 grants the user 302 a first level of access to the device 100. The first level of access may involve granting the user 302 access to telephone functions or lower-security applications of the device 100. For example, the processor 202 may control the audio output 206 to inform the user 302 that "You missed two phone calls and have one voicemail." The processor 202 may also control the display 108 to display the user's access level (e.g., "You are now able to access the phone functions").
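The description gives 50% as an example first-level threshold. A sketch of one possible mapping from confidence to access level follows; the 90% second-level threshold is an assumption for illustration only:

```python
FIRST_LEVEL_MIN = 0.50   # from the example above: phone functions, lower-security apps
SECOND_LEVEL_MIN = 0.90  # assumed threshold for pictures, files, email

def access_level(confidence: float) -> int:
    """Map an authentication confidence to 0 (none), 1 (first), or 2 (second)."""
    if confidence >= SECOND_LEVEL_MIN:
        return 2
    if confidence >= FIRST_LEVEL_MIN:
        return 1
    return 0
```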
The processor 202 continues to receive data (position, motion, speed, and context) from the first motion sensor 116A and analyzes that data. At block 404, the processor 202 alters a setting of the first imager 110A based on the detected motion. For example, the processor 202 may determine, based on the detected motion, that the user 302 is moving at or above a certain speed threshold (e.g., 3 feet per second) and, on that basis, may increase the frame rate of the first imager 110A (e.g., from 20 frames per second ("fps") to 50 fps). The increased frame rate allows the first imager 110A to obtain more detail about the user 302, compensating for the fact that the user 302 is now in motion or is moving faster. Other ways in which the processor 202 can alter a setting of the first imager 110A include controlling the first imager 110A to change one or more of its shutter speed, shutter timing, illumination setting, resolution, aperture, and zoom setting. In various embodiments, any or all of these changes may be triggered by the same motion sensor that prompted the processor 202 to turn on the first imager 110A.
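A sketch of the block 404 adjustment, using the example figures above (3 feet per second; 20 fps to 50 fps); the `ImagerSettings` fields and the shutter-speed change are illustrative assumptions rather than details from the disclosure:

```python
from dataclasses import dataclass

@dataclass
class ImagerSettings:
    frame_rate: int = 20           # frames per second
    shutter_speed: float = 1 / 60  # seconds; assumed default

def adjust_for_motion(settings: ImagerSettings,
                      speed_ft_per_s: float) -> ImagerSettings:
    """Raise the frame rate (and, as an assumption, the shutter speed)
    when the user moves at or above the threshold."""
    if speed_ft_per_s >= 3.0:             # speed threshold from the example
        settings.frame_rate = 50          # capture more detail of a moving user
        settings.shutter_speed = 1 / 250  # assumed: reduce motion blur
    return settings
```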
After the processor 202 alters the setting, the processor 202 controls the first imager 110A to capture a second set of image data of the user 302 (block 406) and provide the second set of image data to the processor 202. For example, the processor 202 may receive a second moving video of the user from the first imager 110A, this time at the higher frame rate.
In this example, it is assumed that the processor 202 is able to use the second set of image data (e.g., the second, higher-frame-rate moving video) to authenticate the user 302 (block 408). For example, the processor 202 may authenticate the user 302 with a high enough confidence level to grant the user 302 a second level of access. The processor 202 grants the user 302 the second level of access to the device 100 based on the second set of image data. Granting the second level of access may involve the processor 202 granting the user 302 access to one or more of pictures, files, emails, or higher-security applications on the device 100. The processor 202 may also control the display 108 to display the user's access level (e.g., "You are now able to access email").
In another embodiment, the device 100 uses multiple imagers to gradually authenticate a user. Referring to FIG. 5 and FIG. 6, the user 502 approaches the device 100 and moves within viewing range of the first imager 110A.
As shown in FIG. 6, the first motion sensor 116A detects the motion of the user 502, in response to which the processor 202 turns on the first imager 110A and controls the first imager 110A to capture a first set of image data of the user 502 (block 602). At block 604, the processor 202 grants the user 502 a first level of access to the device 100 based on the first set of image data.
The processor 202 then receives data regarding the user 502, including the user's position, motion (including gait), speed, and context, from the second motion sensor 116B. The processor 202 analyzes the data from the second motion sensor 116B and, based on this motion data (and possibly on further data from the first motion sensor 116A), determines that the user 502 has moved within viewing range of the second imager 110B. The processor 202 reacts by turning on the second imager 110B and controlling the second imager 110B to capture a second set of image data of the user 502 (block 606). The processor 202 then stitches the first set of image data and the second set of image data together (block 608). This stitching process gives the processor 202 a more comprehensive view of the user 502 on which to attempt authentication. At block 610, the processor 202 grants the user 502 a second level of access to the electronic device 100 based on the stitched first and second sets of image data. In doing so, the processor 202 may also use the stitched images to assess the environment surrounding the device 100 (such as the walls, ceiling, room settings, and table) and grant a level of access to the user 502 if the processor 202 determines that the environment is specific to the user (the user's house, office, car, etc.). The processor 202 can also use the surrounding environment to reinforce the biometric data (e.g., the user's gait) collected regarding the user. In this scenario, the combination of the environmental authentication and the biometric authentication is enough for the processor 202 to raise the level of access from the first level to the second level at block 610.
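One way to picture the block 610 reinforcement is as a weighted blend of a biometric score and an environment-recognition score; the scores and the 0.7 weight below are purely assumed for illustration:

```python
def fused_confidence(biometric: float, environment: float,
                     weight: float = 0.7) -> float:
    """Blend biometric (e.g., gait) and environment (e.g., the user's
    own office) confidence scores into a single value."""
    return weight * biometric + (1.0 - weight) * environment

# Example: a biometric score too low for second-level access on its own (0.85)
# is lifted by a strong environmental match (0.95).
print(fused_confidence(0.85, 0.95))  # approximately 0.88
```

Fed into a mapping like the `access_level` sketch above, such a fused score could be what justifies raising access from the first level to the second.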
The process described in conjunction with FIG. 6 may also incorporate the setting alterations described in conjunction with FIG. 4. For example, the processor 202 may alter a setting of the first imager 110A or the second imager 110B based on the detected motion of the user 502 before capturing the corresponding set of image data.
Furthermore, the process described in conjunction with FIG. 6 is not limited to two imagers. The processor 202 may capture additional sets of image data with additional imagers, stitch them together, and grant progressively higher levels of access as its confidence in the user's identity increases.
It should be understood that the embodiments described herein should be considered in a descriptive sense only and not for purposes of limitation. Descriptions of features or aspects within each embodiment should typically be considered as available for other similar features or aspects in other embodiments.
While one or more embodiments of the present techniques have been described with reference to the figures, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the techniques as defined by the following claims. For example, the steps of the flow diagrams of FIG. 4 and FIG. 6 may be carried out in different orders, and some steps may be combined or omitted.
This application is a continuation of U.S. patent application Ser. No. 14/276,107, filed May 13, 2014, having inventors Rachid M. Alameh et al., entitled “Electronic Device with Method for Controlling Access to Same”, commonly assigned to the assignee of the present application, which is hereby incorporated by reference.
U.S. Patent Documents

| Number | Name | Date | Kind |
|---|---|---|---|
| 6278884 | Kim | Aug 2001 | B1 |
| 6707487 | Aman et al. | Mar 2004 | B1 |
| 8364971 | Bell | Jan 2013 | B2 |
| 8594374 | Bozarth | Nov 2013 | B1 |
| 8646060 | Ben Ayed | Feb 2014 | B1 |
| 8693726 | Karakotsios | Apr 2014 | B2 |
| 8856541 | Chaudhury | Oct 2014 | B1 |
| 8886953 | Sipe | Nov 2014 | B1 |
| 8966613 | Horvitz | Feb 2015 | B2 |
| 8983201 | Ran | Mar 2015 | B2 |
| 9066125 | Sands | Jun 2015 | B2 |
| 9147061 | McClendon | Sep 2015 | B1 |
| 9158904 | Ross | Oct 2015 | B1 |
| 9195817 | Scully-Power | Nov 2015 | B2 |
| 9202105 | Wang | Dec 2015 | B1 |
| 9213817 | Chatterton | Dec 2015 | B2 |
| 9280652 | Bozarth | Mar 2016 | B1 |
| 9323912 | Schultz | Apr 2016 | B2 |
| 9408076 | Chen | Aug 2016 | B2 |
| 9424418 | Li et al. | Aug 2016 | B2 |
| 9510196 | Sensharma | Nov 2016 | B2 |
| 9519769 | Azar | Dec 2016 | B2 |
| 9552684 | Bacco | Jan 2017 | B2 |
| 9576121 | Cao | Feb 2017 | B2 |
| 9600304 | DiVincent | Mar 2017 | B2 |
| 9626493 | Cohen | Apr 2017 | B2 |
| 9690480 | Ferren | Jun 2017 | B2 |
| 9710629 | Alameh | Jul 2017 | B2 |
| 9710691 | Hatcher | Jul 2017 | B1 |
| 9721175 | Kursun | Aug 2017 | B2 |
| 9747428 | Tartz | Aug 2017 | B2 |
| 9760383 | DiVincent | Sep 2017 | B2 |
| 9760785 | Kursun | Sep 2017 | B2 |
| 9778842 | Ferren | Oct 2017 | B2 |
| 9785763 | Scully-Power | Oct 2017 | B2 |
| 20020107649 | Takiguchi | Aug 2002 | A1 |
| 20020109863 | Monroe | Aug 2002 | A1 |
| 20030053662 | Evoy | Mar 2003 | A1 |
| 20050232470 | Chaudhari et al. | Oct 2005 | A1 |
| 20060058920 | Matsunaga et al. | Mar 2006 | A1 |
| 20060081771 | Eliad Wardimon | Apr 2006 | A1 |
| 20070005988 | Zhang et al. | Jan 2007 | A1 |
| 20070168677 | Kudo et al. | Jul 2007 | A1 |
| 20080220809 | Hansen | Sep 2008 | A1 |
| 20080225120 | Stuecker | Sep 2008 | A1 |
| 20090083847 | Fadell | Mar 2009 | A1 |
| 20090088204 | Culbert | Apr 2009 | A1 |
| 20100134310 | Zheng et al. | Jun 2010 | A1 |
| 20100205667 | Anderson et al. | Aug 2010 | A1 |
| 20100299530 | Bell | Nov 2010 | A1 |
| 20110037866 | Iwamoto | Feb 2011 | A1 |
| 20120038796 | Posa et al. | Feb 2012 | A1 |
| 20120046012 | Forutanpour et al. | Feb 2012 | A1 |
| 20120287031 | Valko et al. | Nov 2012 | A1 |
| 20120287035 | Valko et al. | Nov 2012 | A1 |
| 20120315016 | Fung | Dec 2012 | A1 |
| 20130004016 | Karakotsios | Jan 2013 | A1 |
| 20130055348 | Strauss | Feb 2013 | A1 |
| 20130086674 | Horvitz | Apr 2013 | A1 |
| 20130179965 | Li et al. | Jul 2013 | A1 |
| 20130208103 | Sands | Aug 2013 | A1 |
| 20130227651 | Schultz et al. | Aug 2013 | A1 |
| 20130267204 | Schultz | Oct 2013 | A1 |
| 20130287031 | Ge | Oct 2013 | A1 |
| 20130322705 | Wong | Dec 2013 | A1 |
| 20130326613 | Kochanski | Dec 2013 | A1 |
| 20140013422 | Janus et al. | Jan 2014 | A1 |
| 20140059673 | Azar | Feb 2014 | A1 |
| 20140075548 | Sampathkumaran et al. | Mar 2014 | A1 |
| 20140089243 | Oppenheimer | Mar 2014 | A1 |
| 20140112550 | Hanna | Apr 2014 | A1 |
| 20140118520 | Slaby | May 2014 | A1 |
| 20140123275 | Azar | May 2014 | A1 |
| 20140130127 | Toole et al. | May 2014 | A1 |
| 20140137191 | Goldsmith et al. | May 2014 | A1 |
| 20140197922 | Stanwood et al. | Jul 2014 | A1 |
| 20140219515 | Karakotsios | Aug 2014 | A1 |
| 20140230047 | Scully-Power et al. | Aug 2014 | A1 |
| 20140250523 | Savvides et al. | Sep 2014 | A1 |
| 20140282271 | Lu et al. | Sep 2014 | A1 |
| 20140289834 | Lindemann | Sep 2014 | A1 |
| 20140310801 | Juhani | Oct 2014 | A1 |
| 20140330560 | Venkatesha | Nov 2014 | A1 |
| 20140333413 | Kursun | Nov 2014 | A1 |
| 20140333414 | Kursun | Nov 2014 | A1 |
| 20140337949 | Hoyos | Nov 2014 | A1 |
| 20140362231 | Bietsch et al. | Dec 2014 | A1 |
| 20140366128 | Venkateswaran | Dec 2014 | A1 |
| 20140366159 | Cohen | Dec 2014 | A1 |
| 20150026797 | Cao | Jan 2015 | A1 |
| 20150067823 | Chatterton | Mar 2015 | A1 |
| 20150071508 | Boshra | Mar 2015 | A1 |
| 20150177842 | Rudenko | Jun 2015 | A1 |
| 20150186628 | Bush | Jul 2015 | A1 |
| 20150186711 | Baldwin | Jul 2015 | A1 |
| 20150193611 | Zhao | Jul 2015 | A1 |
| 20150205622 | DiVincent | Jul 2015 | A1 |
| 20150205623 | DiVincent | Jul 2015 | A1 |
| 20150213245 | Tartz | Jul 2015 | A1 |
| 20150220931 | Alsina | Aug 2015 | A1 |
| 20150221151 | Bacco | Aug 2015 | A1 |
| 20150264567 | Sensharma | Sep 2015 | A1 |
| 20150332032 | Alameh et al. | Nov 2015 | A1 |
| 20150347732 | Alameh et al. | Dec 2015 | A1 |
| 20160026884 | Ferren | Jan 2016 | A1 |
| 20160034901 | Ferren | Feb 2016 | A1 |
| 20160071111 | Wang et al. | Mar 2016 | A1 |
| 20160203306 | Boshra | Jul 2016 | A1 |
| 20160226865 | Chen et al. | Aug 2016 | A1 |
| 20170032114 | Turgeman | Feb 2017 | A1 |
| 20170063852 | Azar | Mar 2017 | A1 |
| 20170076077 | Zhao | Mar 2017 | A1 |
| 20170109950 | Bacco | Apr 2017 | A1 |
| 20170169204 | Fadell | Jun 2017 | A1 |
| 20170199997 | Fadell | Jul 2017 | A1 |
Foreign Patent Documents

| Number | Date | Country |
|---|---|---|
| 102662554 | Sep 2012 | CN |
| 103761463 | Apr 2014 | CN |
| 1990769 | Nov 2008 | EP |
| 2709031 | Mar 2014 | EP |
| 2474536 | Apr 2011 | GB |
Other Publications

| Entry |
|---|
| European Patent Office; International Search Report and Written Opinion; PCT Application No. PCT/US2015/030527; dated Sep. 2, 2015. |
| Chetty et al.; Multimedia Sensor Fusion for Retrieving Identity in Biometric Access Control System; ACM Transactions on Multimedia Computing, Communications and Applications; vol. 3, No. 4; Article 26; pp. 1-21; Nov. 2010. |
| U.S. Patent and Trademark Office; Final Office Action; U.S. Appl. No. 14/276,107; dated Dec. 1, 2016. |
| U.S. Patent and Trademark Office; Non-Final Office Action; U.S. Appl. No. 14/276,107; dated Jul. 18, 2016. |
| U.S. Patent and Trademark Office; Final Office Action; U.S. Appl. No. 14/276,107; dated Feb. 1, 2016. |
| U.S. Patent and Trademark Office; Non-Final Office Action; U.S. Appl. No. 14/276,107; dated Jul. 31, 2015. |
| European Patent Office; EP Office Action; EP Application No. 15726444.1; dated Dec. 20, 2016. |
| U.S. Patent and Trademark Office; U.S. Appl. No. 14/289,978; Non-Final Office Action; dated Sep. 4, 2015. |
| U.S. Patent and Trademark Office; U.S. Appl. No. 14/289,978; Final Office Action; dated Mar. 15, 2016. |
| U.S. Patent and Trademark Office; U.S. Appl. No. 14/289,978; Non-Final Office Action; dated Mar. 23, 2017. |
| U.S. Patent and Trademark Office; U.S. Appl. No. 14/289,978; Final Office Action; dated Jul. 20, 2017. |
| International Search Report and Written Opinion; International Application No. PCT/US2015/032632; dated Sep. 14, 2015. |
| U.S. Patent and Trademark Office; U.S. Appl. No. 14/289,978; Non-Final Office Action; dated Mar. 13, 2018. |
| U.S. Patent and Trademark Office; U.S. Appl. No. 14/289,978; Final Office Action; dated Aug. 27, 2018. |
Publication Information

| Number | Date | Country |
|---|---|---|
| 20170277876 A1 | Sep 2017 | US |
Related U.S. Application Data

| Relationship | Number | Date | Country |
|---|---|---|---|
| Parent | 14276107 | May 2014 | US |
| Child | 15618427 | | US |