Electronic device with method for controlling access to same

Information

  • Patent Grant
  • Patent Number
    10,255,417
  • Date Filed
    Friday, June 9, 2017
  • Date Issued
    Tuesday, April 9, 2019
Abstract
An electronic device is able to alter one or more settings of its imager based on the motion of a user that the device is attempting to authenticate. The electronic device, in one implementation, captures a first set of image data of the user (e.g., a video or still photo of the user), detects motion of the user, alters a setting of the imager based on the motion, captures a second set of image data of the user, and authenticates the user based on the second set of image data. In some implementations, the electronic device has multiple imagers, and activates one or more additional imagers based on the detected motion of the user.
Description
TECHNICAL FIELD

The present disclosure is related generally to user authentication techniques on electronic devices.


BACKGROUND

Although the potential advantages of using biometric authentication over traditional personal identification number (“PIN”) authentication have long been understood, its use in consumer electronic devices has only recently become popular. With biometric authentication, a user does not need to enter a PIN and, under the right conditions, does not even need to touch the device in order to unlock it.


Most existing biometric authentication schemes use the same basic access logic that traditional PIN-based systems use. That is, a user is either authenticated or is not; the user either gains full access or no access. Furthermore, such schemes generally do not adjust in real time for dynamic conditions such as the movement and position of the user.





DRAWINGS

While the appended claims set forth the features of the present techniques with particularity, these techniques, together with their objects and advantages, may be best understood from the following detailed description taken in conjunction with the accompanying drawings of which:



FIG. 1A is a front view of an electronic device according to an embodiment;



FIG. 1B is a rear view of an electronic device according to an embodiment;



FIG. 2 is a block diagram of the electronic device according to an embodiment;



FIG. 3 is a diagrammatic view of a scenario in which the electronic device may be used;



FIG. 4 is a process flow diagram of a method that may be carried out in an embodiment;



FIG. 5 is a diagrammatic view of another scenario in which the electronic device may be used; and



FIG. 6 is a process flow diagram of a method that may be carried out in another embodiment.





DESCRIPTION

According to various embodiments of the disclosure, an electronic device (also referred to as “the device”) is able to alter one or more settings of its imager (e.g., its camera) based on the motion of a user that the device is attempting to authenticate. In an embodiment, the device captures a first set of image data of the user (e.g., a moving video or still image of the user), detects motion of the user, alters a setting of the imager based on the motion, captures a second set of image data of the user, and authenticates the user based on the second set of image data.
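
For illustration only, the adaptive-capture flow just described might be sketched as follows. The Imager and Motion types and the authenticate callback are hypothetical stand-ins; the disclosure does not define a programming interface (the 3-feet-per-second threshold and 50 fps figure come from the example scenario later in this description).

```python
# Hypothetical sketch of the capture / adjust / re-capture flow; no names
# here come from the patent itself.

from dataclasses import dataclass


@dataclass
class Motion:
    speed_ft_per_s: float  # user's speed as reported by a motion sensor


class Imager:
    """Stand-in for a camera whose settings can be altered at runtime."""

    def __init__(self, frame_rate_fps: int = 20):
        self.frame_rate_fps = frame_rate_fps

    def capture(self) -> list:
        # A real imager would return frames; here we return a placeholder
        # tagged with the frame rate in effect at capture time.
        return [f"frame@{self.frame_rate_fps}fps"]


def adaptive_authenticate(imager: Imager, motion: Motion, authenticate) -> bool:
    first_set = imager.capture()        # capture a first set of image data
    authenticate(first_set)             # initial (possibly partial) attempt
    if motion.speed_ft_per_s >= 3.0:    # detected motion of the user
        imager.frame_rate_fps = 50      # alter a setting of the imager
    second_set = imager.capture()       # capture a second set of image data
    return authenticate(second_set)     # authenticate on the second set
```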


According to an embodiment of the disclosure, the device grants the user a first level of access to the device based on the first set of image data and grants the user a second level of access to the device based on the second set of image data. The number of possible access levels is not limited, and the example of two levels discussed herein is only meant to be illustrative. Additionally, the electronic device may capture the two sets of image data with two different imagers, stitch the sets of image data together, and carry out authentication on the stitched sets of image data. The number of imagers that may be used is not limited to two, however.
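
One way to represent tiered access is sketched below; the two named levels and their feature sets mirror the examples given later in this description, while the enum itself and the feature strings are assumptions.

```python
# Illustrative tiered-access table; the patent leaves the number of
# levels and their contents open.

from enum import IntEnum


class AccessLevel(IntEnum):
    NONE = 0
    PHONE = 1   # telephone functions, lower-security applications
    FULL = 2    # pictures, files, emails, higher-security applications


ALLOWED_FEATURES = {
    AccessLevel.NONE: set(),
    AccessLevel.PHONE: {"phone", "voicemail"},
    AccessLevel.FULL: {"phone", "voicemail", "pictures", "files", "email"},
}


def may_use(level: AccessLevel, feature: str) -> bool:
    """Return True if the given access level unlocks the given feature."""
    return feature in ALLOWED_FEATURES[level]
```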


Turning to FIG. 1A and FIG. 1B, an embodiment of the electronic device (“the device”), generally labeled 100, includes a housing 102 having a front side 104 and a rear side 106. Set along the perimeter of the housing are a first imager 110A, a second imager 110B, a third imager 110C, and a fourth imager 110D. Each of the first through fourth imagers has a field of view that extends outwardly from the perimeter of the device 100. Also set along the perimeter of the device 100 are a first motion sensor 116A, a second motion sensor 116B, a third motion sensor 116C, and a fourth motion sensor 116D. Each motion sensor is configured to sense motion external to the device 100. Each motion sensor may be implemented as a passive infrared detector, such as a digital thermopile sensor, or as an active sensor that uses reflected light from a light source of the device 100.
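
Purely as an assumption about how this perimeter hardware might be coordinated, the pairing of each motion sensor with the imager that covers the same field of view could be kept in a simple lookup table (the reference numerals follow FIGS. 1A and 1B):

```python
# Hypothetical wiring table pairing each perimeter motion sensor with
# the imager that shares its field of view.

SENSOR_TO_IMAGER = {
    "116A": "110A",
    "116B": "110B",
    "116C": "110C",
    "116D": "110D",
}


def imager_for_sensor(sensor_id: str) -> str:
    """Return the imager to wake when a given motion sensor fires."""
    return SENSOR_TO_IMAGER[sensor_id]
```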


Set within the front side 104 of the housing 102 are a display 108 (e.g., an organic light-emitting diode display) and a fifth imager 110E (e.g., a front-facing camera). Set within the rear side 106 of the housing 102 is a sixth imager 110F (e.g., a rear-facing camera). Although depicted in FIGS. 1A and 1B as a smartphone, the electronic device 100 may be implemented as other types of devices, including a tablet computer, a portable gaming device, and a wearable device (e.g., a smart watch).


Turning to FIG. 2, an embodiment of the electronic device 100 includes a processor 202, network communication hardware 204 (e.g., a WiFi chip or a cellular baseband chipset), an audio output 206 (e.g., a speaker), a memory 208 (which can be implemented as volatile memory or non-volatile memory), and a light source 212 (e.g., an infrared light-emitting diode). In various embodiments, the processor 202 retrieves instructions and data from the memory 208 and, using the instructions and data, carries out the methods described herein. Each of the elements of FIG. 2 (including the elements of FIGS. 1A and 1B that appear in FIG. 2) is communicatively linked to one or more other elements via one or more data pathways 226. Possible implementations of the data pathways 226 include wires, conductive pathways on a microchip, and wireless connections. Possible implementations of the processor 202 include a microprocessor and a controller.


Turning to FIG. 3 and to the flowchart of FIG. 4, a procedure that the device 100 carries out to authenticate a user in an embodiment will now be described. As shown in FIG. 3, the electronic device 100 is lying on a table in a room 304. A user 302 of the device enters the room 304 at position A and is moving. When the user is at position A, the first motion sensor 116A detects the user 302 and provides data regarding the user's position, motion (including the user's gait), speed, and context to the processor 202 (FIG. 2). In response to receiving the data, the processor 202 turns on the first imager 110A and controls the first imager 110A to capture a first set of image data (i.e., a still image, multiple still images, or multiple images organized as a moving image) of the user 302 (block 402) and provide the first set of image data to the processor 202. The processor 202 attempts to authenticate the user 302 using the first set of image data. For example, the processor 202 may attempt to authenticate the user 302 based on biometric data, such as the user's body geometry (e.g., the user's body shape, gender, height, girth, and gait). Thus, if the processor 202 knows that an authorized user is a tall male, and the image data indicates that the user 302 is a tall male, then the processor 202 will determine that it is possible that the user 302 is the authorized user. Conversely, if the image data indicates that the user 302 is a short female, then the authentication will fail.
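
A toy version of this coarse body-geometry screen appears below. It assumes the enrolled profile and the attributes extracted from the image data are already available as simple fields; real feature extraction is, of course, far more involved, and the field names and tolerance are assumptions.

```python
# Minimal body-geometry screen; profile fields and the tolerance value
# are assumptions, not taken from the patent.

def geometry_matches(profile: dict, observed: dict,
                     height_tolerance_in: float = 2.0) -> bool:
    """Coarse check: does the observed person resemble the enrolled user?"""
    if observed["gender"] != profile["gender"]:
        return False  # e.g., a short female cannot match a tall male
    return abs(observed["height_in"] - profile["height_in"]) <= height_tolerance_in


# The tall-male example from the text: a plausible match passes,
# a mismatched build fails.
authorized = {"gender": "male", "height_in": 74}
print(geometry_matches(authorized, {"gender": "male", "height_in": 73}))    # True
print(geometry_matches(authorized, {"gender": "female", "height_in": 62}))  # False
```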


In this scenario, the processor 202 determines, with at least a 50% confidence level (based on its authentication attempt with the first set of image data), that the user 302 is an authorized user. Based on this determination, the processor 202 grants the user 302 a first level of access to the device 100. The first level of access may involve granting the user 302 access to telephone functions or lower security applications of the device 100. For example, the processor 202 may control the audio output 206 to inform the user 302 that “You missed two phone calls and have one voicemail.” The processor 202 may also control the display 108 to display the user's access level (e.g., “You are now able to access the phone functions”).
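
The confidence-to-access mapping implied by this scenario could be sketched as follows; the 0.5 threshold comes from the text above, while the 0.9 threshold for the second level is an illustrative assumption.

```python
# Map an authentication confidence to an access level. Only the 50%
# figure is from the description; the rest is assumed for illustration.

def access_level_for(confidence: float) -> int:
    if confidence >= 0.9:
        return 2  # second level: pictures, files, email
    if confidence >= 0.5:
        return 1  # first level: telephone functions
    return 0      # no access


assert access_level_for(0.55) == 1
assert access_level_for(0.95) == 2
```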


The processor 202 continues to receive data (position, motion, speed, and context) from the first motion sensor 116A and analyzes that data. At block 404, the processor 202 alters a setting of the first imager 110A based on the detected motion. For example, the processor 202 may determine, based on the detected motion, that the user 302 is moving at or above a certain speed threshold (e.g., 3 feet per second) and, based on this fact, may increase the frame rate of the first imager 110A (e.g., from 20 frames per second (“fps”) to 50 fps). This increase in frame rate allows the first imager 110A to obtain more detail about the user in order to compensate for the fact that the user 302 is now in motion or is now moving faster. Other ways that the processor 202 can alter a setting of the first imager 110A include controlling the first imager 110A to change one or more of its shutter speed, shutter timing, illumination setting, resolution, aperture, and zoom setting. In various embodiments, any or all of these changes may be triggered by the same motion sensor that prompted the processor 202 to turn on the first imager 110A.
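
As a sketch, this motion-dependent adjustment might be expressed as a function from detected speed to a set of imager settings. The 3-feet-per-second threshold and the 20-to-50 fps change come from the example above; the shutter-speed and resolution values are assumptions added for illustration.

```python
# Derive imager settings from the user's detected speed; values other
# than the speed threshold and frame rates are assumed.

SPEED_THRESHOLD_FT_PER_S = 3.0


def settings_for_motion(speed_ft_per_s: float) -> dict:
    moving_fast = speed_ft_per_s >= SPEED_THRESHOLD_FT_PER_S
    return {
        "frame_rate_fps": 50 if moving_fast else 20,   # per the example above
        "shutter_speed_s": 1 / 500 if moving_fast else 1 / 125,
        "resolution": "high" if moving_fast else "standard",
    }


print(settings_for_motion(3.5))  # frame_rate_fps jumps to 50
```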


After the processor 202 alters the setting, the processor 202 controls the first imager 110A to capture a second set of image data of the user 302 (block 406) and provide the second set of image data to the processor 202. For example, the processor 202 may receive a second moving video of the user from the first imager 110A, this time at the higher frame rate.


In this example, it is assumed that the processor 202 is able to use the second set of image data (e.g., the second, higher-frame-rate moving video) to authenticate the user 302 (block 408). For example, the processor 202 may authenticate the user 302 with a high enough confidence level to grant the user 302 a second level of access. The processor 202 grants the user 302 the second level of access to the device 100 based on the second set of image data. Granting the second level of access may involve the processor 202 granting the user 302 access to one or more of pictures, files, emails, or higher security applications on the device 100. The processor 202 may also control the display 108 to display the user's access level (e.g., “You are now able to access email”).


In another embodiment, the device 100 uses multiple imagers to gradually authenticate a user. Referring to FIGS. 5 and 6, a procedure for doing so will now be described.


As shown in FIG. 5, the electronic device 100 is lying on a table in a room 504. A user 502 of the device enters the room 504 at position A and is moving. When the user is at position A, the first motion sensor 116A detects the user 502 and provides data regarding the user's position, motion (including the user's gait), speed, and context to the processor 202 (FIG. 2). In response to receiving the data, the processor 202 turns on the first imager 110A and controls the first imager 110A to capture a first set of image data of the user 502 (block 602) and provide the first set of image data to the processor 202. The processor 202 attempts to authenticate the user 502 using the first set of image data. In this scenario, the processor 202 determines, with at least a 50% confidence level (based on its authentication attempt with the first set of image data), that the user 502 is an authorized user. Based on this determination, the processor 202 grants the user 502 a first level of access to the device 100 (block 604).


The processor 202 then receives data regarding the user, including the user's position, motion (including the user's gait), speed, and context, from the second motion sensor 116B. The processor 202 analyzes the data from the second motion sensor 116B and, based on this motion data (and possibly based on further data from the first motion sensor 116A), determines that the user 502 has moved within viewing range of the second imager 110B. The processor 202 reacts by turning on the second imager 110B and controlling the second imager 110B to capture a second set of image data of the user 502 (block 606). The processor 202 then stitches the first set of image data and the second set of image data together (block 608). This stitching process allows the processor 202 to obtain a more comprehensive view of the user 502 and attempt to authenticate the user 502 on that basis. At block 610, the processor 202 grants the user 502 a second level of access to the electronic device 100 based on the stitched first and second sets of image data. In doing so, the processor 202 may also use the stitched images to assess the environment surrounding the device 100—such as the walls, ceiling, room settings, and table—and grant a level of access to the user if the processor 202 determines that the environment is specific to the user (the user's house, office, car, etc.). The processor 202 can also use the surrounding environment to reinforce the biometric data (e.g., the user's gait) collected regarding the user. In this scenario, the combination of the environmental authentication and the biometric authentication is enough for the processor 202 to raise the level of access from the first level to the second level at block 610.
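
For illustration, the stitching step might be approximated with OpenCV's panorama stitcher, used here as a stand-in for whatever stitching the device actually performs; frames_a and frames_b would hold frames from the first and second imagers.

```python
# Stitch frames from two imagers into one composite view using OpenCV.
# This is a sketch; the patent does not specify a stitching algorithm.

import cv2


def stitch_views(frames_a, frames_b):
    """Return a composite image, or None if the views do not overlap."""
    stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
    status, composite = stitcher.stitch(list(frames_a) + list(frames_b))
    if status != cv2.Stitcher_OK:
        return None   # insufficient overlap between the two views
    return composite  # a more comprehensive view of the user and the room
```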


The process described in conjunction with FIG. 6 is not limited to two imagers. For example, if the user 502 continued to walk around the device 100, the third and fourth motion sensors 116C and 116D could detect the motion and signal the processor 202. The processor 202 could react by activating the third imager 110C and the fourth imager 110D, respectively, controlling the imagers to capture third and fourth sets of image data, and performing stitching (and possibly environmental analysis) in order to grant the second level of access, or even further levels of access.


Furthermore, the process described in conjunction with FIG. 6 may also be carried out with sensors of the device 100, such as the motion sensors 116A-116D. For example, as the user walks around the device 100, the processor 202 may stitch the data from the first motion sensor 116A and the second motion sensor 116B together. The stitched data can be used, for example, to map the XY position of the user, and may be part of the basis upon which the processor 202 grants the first or second level of access.
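
A toy XY fix from two sensors might be computed as below, under the strong assumption that each sensor reports a range to the user from a known position; real passive infrared sensors report far coarser data, so this is only a geometric sketch of the idea.

```python
# Intersect two range circles to estimate the user's XY position.
# Sensor positions and ranges are assumed inputs.

import math


def xy_from_two_ranges(p1, r1, p2, r2):
    """Return the (up to two) candidate XY positions of the user."""
    d = math.dist(p1, p2)
    if d == 0 or d > r1 + r2 or d < abs(r1 - r2):
        return []  # the two readings are not mutually consistent
    a = (r1 ** 2 - r2 ** 2 + d ** 2) / (2 * d)  # distance along the baseline
    h = math.sqrt(max(r1 ** 2 - a ** 2, 0.0))   # offset from the baseline
    mx = p1[0] + a * (p2[0] - p1[0]) / d
    my = p1[1] + a * (p2[1] - p1[1]) / d
    ox = h * (p2[1] - p1[1]) / d
    oy = h * (p2[0] - p1[0]) / d
    return [(mx + ox, my - oy), (mx - ox, my + oy)]


print(xy_from_two_ranges((0.0, 0.0), 5.0, (6.0, 0.0), 5.0))
# [(3.0, -4.0), (3.0, 4.0)]
```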


It should be understood that the embodiments described herein should be considered in a descriptive sense only and not for purposes of limitation. Descriptions of features or aspects within each embodiment should typically be considered as available for other similar features or aspects in other embodiments.


While one or more embodiments of the disclosure have been described with reference to the figures, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the disclosure as defined by the following claims. For example, the steps of the flow diagrams of FIGS. 4 and 6 can be reordered in ways that will be apparent to those of skill in the art. Steps may also be added to the flow diagrams of FIGS. 4 and 6 without departing from the spirit of the disclosure.

Claims
  • 1. A method for controlling access to an electronic mobile device, the method carried out by the electronic mobile device comprising: activating an imager of the electronic mobile device, based on detected motion of a user, to capture a first set of image data of the user; carrying out a first authentication procedure on the user based on the user's body geometry with the first set of image data; granting the user a first level of access to the electronic mobile device based on the first set of image data; altering a setting of the imager to facilitate user authentication based on further occurrences of the detected motion of the user; capturing a second set of image data of the user with the imager after altering the setting of the imager; carrying out a second authentication procedure on the user based on the user's body geometry with the second set of image data; and granting the user a second level of access to the electronic mobile device based on the second set of image data.
  • 2. The method of claim 1, wherein: granting the first level of access comprises granting the user access to telephone functions or lower security applications of the electronic mobile device, and granting the second level of access comprises granting the user access to one or more of pictures, files, emails, and higher security applications on the electronic mobile device.
  • 3. The method of claim 1, wherein the user's body geometry is one or more of the user's body shape, body size, and gait.
  • 4. The method of claim 1, wherein the setting is one or more of the imager's frame rate, shutter speed, shutter timing, illumination, resolution, aperture, and zoom.
  • 5. The method of claim 1, further comprising changing an illumination intensity of a light source on the electronic mobile device based on further detected motion of the user.
  • 6. An electronic mobile device comprising: an imager; and a processor configured to: activate the imager, based on detected motion of a user, to capture a first set of image data of the user; carry out a first authentication procedure on the user based on the user's body geometry with the first set of image data; grant the user a first level of access to the electronic mobile device based on the first set of image data; alter a setting of the imager to facilitate user authentication based on further occurrences of the detected motion of the user; activate the imager to capture a second set of image data of the user based on the altered setting of the imager; carry out a second authentication procedure on the user based on the user's body geometry with the second set of image data; and grant the user a second level of access to the electronic mobile device based on the second set of image data.
  • 7. The electronic mobile device of claim 6, wherein: the processor is further configured to grant the first level of access by granting the user access to telephone functions or lower security applications of the electronic mobile device, and the processor is further configured to grant the second level of access by granting the user access to one or more of pictures, files, emails, and higher security applications on the electronic mobile device.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 14/276,107, filed May 13, 2014, having inventors Rachid M. Alameh et al., entitled “Electronic Device with Method for Controlling Access to Same”, commonly assigned to the assignee of the present application, which is hereby incorporated by reference.

US Referenced Citations (115)
Number Name Date Kind
6278884 Kim Aug 2001 B1
6707487 Aman et al. Mar 2004 B1
8364971 Bell Jan 2013 B2
8594374 Bozarth Nov 2013 B1
8646060 Ben Ayed Feb 2014 B1
8693726 Karakotsios Apr 2014 B2
8856541 Chaudhury Oct 2014 B1
8886953 Sipe Nov 2014 B1
8966613 Horvitz Feb 2015 B2
8983201 Ran Mar 2015 B2
9066125 Sands Jun 2015 B2
9147061 McClendon Sep 2015 B1
9158904 Ross Oct 2015 B1
9195817 Scully-Power Nov 2015 B2
9202105 Wang Dec 2015 B1
9213817 Chatterton Dec 2015 B2
9280652 Bozarth Mar 2016 B1
9323912 Schultz Apr 2016 B2
9408076 Chen Aug 2016 B2
9424418 Li et al. Aug 2016 B2
9510196 Sensharma Nov 2016 B2
9519769 Azar Dec 2016 B2
9552684 Bacco Jan 2017 B2
9576121 Cao Feb 2017 B2
9600304 DiVincent Mar 2017 B2
9626493 Cohen Apr 2017 B2
9690480 Ferren Jun 2017 B2
9710629 Alameh Jul 2017 B2
9710691 Hatcher Jul 2017 B1
9721175 Kursun Aug 2017 B2
9747428 Tartz Aug 2017 B2
9760383 DiVincent Sep 2017 B2
9760785 Kursun Sep 2017 B2
9778842 Ferren Oct 2017 B2
9785763 Scully-Power Oct 2017 B2
20020107649 Takiguchi Aug 2002 A1
20020109863 Monroe Aug 2002 A1
20030053662 Evoy Mar 2003 A1
20050232470 Chaudhari et al. Oct 2005 A1
20060058920 Matsunaga et al. Mar 2006 A1
20060081771 Eliad Wardimon Apr 2006 A1
20070005988 Zhang et al. Jan 2007 A1
20070168677 Kudo et al. Jul 2007 A1
20080220809 Hansen Sep 2008 A1
20080225120 Stuecker Sep 2008 A1
20090083847 Fadell Mar 2009 A1
20090088204 Culbert Apr 2009 A1
20100134310 Zheng et al. Jun 2010 A1
20100205667 Anderson et al. Aug 2010 A1
20100299530 Bell Nov 2010 A1
20110037866 Iwamoto Feb 2011 A1
20120038796 Posa et al. Feb 2012 A1
20120046012 Foruntanpour et al. Feb 2012 A1
20120287031 Valko et al. Nov 2012 A1
20120287035 Valko et al. Nov 2012 A1
20120315016 Fung Dec 2012 A1
20130004016 Karakotsios Jan 2013 A1
20130055348 Strauss Feb 2013 A1
20130086674 Horvitz Apr 2013 A1
20130179965 Li et al. Jul 2013 A1
20130208103 Sands Aug 2013 A1
20130227651 Schultz et al. Aug 2013 A1
20130267204 Schultz Oct 2013 A1
20130287031 Ge Oct 2013 A1
20130322705 Wong Dec 2013 A1
20130326613 Kochanski Dec 2013 A1
20140013422 Janus et al. Jan 2014 A1
20140059673 Azar Feb 2014 A1
20140075548 Sampathkumaran et al. Mar 2014 A1
20140089243 Oppenheimer Mar 2014 A1
20140112550 Hanna Apr 2014 A1
20140118520 Slaby May 2014 A1
20140123275 Azar May 2014 A1
20140130127 Toole et al. May 2014 A1
20140137191 Goldsmith et al. May 2014 A1
20140197922 Stanwood et al. Jul 2014 A1
20140219515 Karakotsios Aug 2014 A1
20140230047 Scully-Power et al. Aug 2014 A1
20140250523 Savvides et al. Sep 2014 A1
20140282271 Lu et al. Sep 2014 A1
20140289834 Lindemann Sep 2014 A1
20140310801 Juhani Oct 2014 A1
20140330560 Venkatesha Nov 2014 A1
20140333413 Kursun Nov 2014 A1
20140333414 Kursun Nov 2014 A1
20140337949 Hoyos Nov 2014 A1
20140362231 Bietsch et al. Dec 2014 A1
20140366128 Venkateswaran Dec 2014 A1
20140366159 Cohen Dec 2014 A1
20150026797 Cao Jan 2015 A1
20150067823 Chatterton Mar 2015 A1
20150071508 Boshra Mar 2015 A1
20150177842 Rudenko Jun 2015 A1
20150186628 Bush Jul 2015 A1
20150186711 Baldwin Jul 2015 A1
20150193611 Zhao Jul 2015 A1
20150205622 DiVincent Jul 2015 A1
20150205623 DiVincent Jul 2015 A1
20150213245 Tartz Jul 2015 A1
20150220931 Alsina Aug 2015 A1
20150221151 Bacco Aug 2015 A1
20150264567 Sensharma Sep 2015 A1
20150332032 Alameh et al. Nov 2015 A1
20150347732 Alameh et al. Dec 2015 A1
20160026884 Ferren Jan 2016 A1
20160034901 Ferren Feb 2016 A1
20160071111 Wang et al. Mar 2016 A1
20160203306 Boshra Jul 2016 A1
20160226865 Chen et al. Aug 2016 A1
20170032114 Turgeman Feb 2017 A1
20170063852 Azar Mar 2017 A1
20170076077 Zhao Mar 2017 A1
20170109950 Bacco Apr 2017 A1
20170169204 Fadell Jun 2017 A1
20170199997 Fadell Jul 2017 A1
Foreign Referenced Citations (5)
Number Date Country
102662554 Sep 2012 CN
103761463 Apr 2014 CN
1990769 Nov 2008 EP
2709031 Mar 2014 EP
2474536 Apr 2011 GB
Non-Patent Literature Citations (14)
Entry
European Patent Office; International Search Report and Written Opinion; PCT Application No. PCT/US2015/030527; dated Sep. 2, 2015.
Chetty et al.; Multimedia Sensor Fusion for Retrieving Identity in Biometric Access Control System; ACM Transactions on Multimedia Computing, Communications and Application; vol. 3, No. 4; Article 26; pp. 1-21; Nov. 2010.
U.S. Patent and Trademark Office; Final Office Action; U.S. Appl. No. 14/276,107; dated Dec. 1, 2016.
U.S. Patent and Trademark Office; Non-Final Office Action; U.S. Appl. No. 14/276,107; dated Jul. 18, 2016.
U.S. Patent and Trademark Office; Final Office Action; U.S. Appl. No. 14/276,107; dated Feb. 1, 2016.
U.S. Patent and Trademark Office; Non-Final Office Action; U.S. Appl. No. 14/276,107; dated Jul. 31, 2015.
European Patent Office; EP Office Action; EP Application No. 15726444.1; dated Dec. 20, 2016.
U.S. Patent and Trademark Office; U.S. Appl. No. 14/289,978; Non-Final Office Action; dated Sep. 4, 2015.
U.S. Patent and Trademark Office; U.S. Appl. No. 14/289,978; Final Office Action; dated Mar. 15, 2016.
U.S. Patent and Trademark Office; U.S. Appl. No. 14/289,978; Non-Final Office Action; dated Mar. 23, 2017.
U.S. Patent and Trademark Office; U.S. Appl. No. 14/289,978; Final Office Action; dated Jul. 20, 2017.
International Search Report and Written Opinion; International Application No. PCT/US2015/032632; dated Sep. 14, 2015.
U.S. Patent and Trademark Office; U.S. Appl. No. 14/289,978; Non-Final Office Action; dated Mar. 13, 2018.
U.S. Patent and Trademark Office; U.S. Appl. No. 14/289,978; Final Office Action; dated Aug. 27, 2018.
Related Publications (1)
Number Date Country
20170277876 A1 Sep 2017 US
Continuations (1)
Number Date Country
Parent 14276107 May 2014 US
Child 15618427 US