Embodiments disclosed herein are generally directed to detecting an object, and more particularly to capturing an image of the object using a sensor array coupled to a touch-sensitive screen.
Mobile devices are ubiquitous and may include a smartphone, tablet computer, personal digital assistant (PDA), portable game console, palmtop computer, wearable health monitor, and other portable electronic devices. A mobile device may be “locked,” preventing persons other than the owner of the mobile device from accessing it. The owner may set up a password on the mobile device and be authenticated by entering the password correctly, which may be inconvenient. Rather than have the user enter her password into the mobile device, it may be desirable to use biometrics, such as fingerprints captured by fingerprint sensors, to authenticate the user.
Many mobile devices available today have capacitive touchscreens, which typically use an insulator, such as glass or plastic, coated with one or more layers of patterned indium tin oxide (ITO) that serves as a transparent conductor. When a human finger touches or is positioned near an active touchscreen, the finger acts as a modest conductor to modify local electric fields. More specifically, when a finger touches the surface of a touchscreen, a distortion of the localized electric field occurs that may be measured as a change in local capacitance between adjacent ITO electrodes, which is then translated into an electrical signal by one or more associated integrated circuits and converted into touch data by algorithms running on one or more local processors.
Conventional capacitive touchscreens have difficulty acquiring fingerprint images because of inherent low resolution and inability to form clear images of fingerprint ridges and valleys, in part due to typical spacings between ITO electrodes that may be ten times that of typical fingerprint ridge-to-valley spacings, and in part due to the relatively shallow valleys of most fingerprints. Capacitance-based fingerprint sensors with higher resolution may work well with thin platens yet have difficulty imaging through typical thicknesses of a cover glass or cover lens of a mobile device display.
Methods, systems, and techniques for capturing one or more sensor images of an object are provided.
Consistent with some embodiments, there is provided a system for capturing one or more sensor images of an object. The system includes a touch system including a touch-sensitive screen and a display of a device. The system also includes a sensor system including a sensor array and a processing component. The sensor array is coupled to the touch-sensitive screen, and the processing component is configured to capture one or more images of an object when the object is detected by the touch-sensitive screen. At least a portion of the sensor array overlaps with at least a portion of the touch-sensitive screen.
Consistent with some embodiments, there is provided a method of capturing one or more sensor images of an object. The method includes detecting, by a sensor array coupled to a touch-sensitive screen of a device, signals reflected from the object with respect to the touch-sensitive screen. The method also includes capturing, based on the reflected signals, one or more images of the object. At least a portion of the sensor array overlaps with at least a portion of the touch-sensitive screen.
Consistent with some embodiments, there is provided a computer-readable medium having stored thereon computer-executable instructions for performing operations, including detecting, by a sensor array coupled to a touch-sensitive screen of a device, signals reflected from an object with respect to the touch-sensitive screen; and capturing, based on the reflected signals, one or more images of the object, wherein at least a portion of the sensor array overlaps with at least a portion of the touch-sensitive screen.
Consistent with some embodiments, there is provided a system for capturing one or more sensor images of an object. The system includes means for detecting signals reflected from the object with respect to a touch-sensitive screen of a device. The system also includes means for capturing one or more images of the object based on the reflected signals. When the object is located above the means for capturing the one or more images, the object is located above at least a portion of the touch-sensitive screen.
In the following description, specific details are set forth describing certain embodiments. It will be apparent, however, to one skilled in the art that the disclosed embodiments may be practiced without some or all of these specific details. The specific embodiments presented are meant to be illustrative, but not limiting. One skilled in the art may realize other material that, although not specifically described herein, is within the scope and spirit of this disclosure.
Object detection system 110 may be used to detect an object 104 such as a stylus or a finger of a user within a proximity of mobile device 102 and to capture one or more images of the object. Object detection system 110 may include a touch system 112 coupled to a sensor system 114 that work together to enhance the user's experience.
Touch system 112 includes a touch-sensitive screen and a visual display 109 of mobile device 102. The touch-sensitive screen, also referred to herein as a “touchscreen,” may be incorporated into the display or positioned above the display of mobile device 102. In some embodiments, the touch-sensitive screen is a resistive touch-sensitive screen that responds to pressure applied to the screen. In some embodiments, the touch-sensitive screen is an optical, radio frequency (RF), infrared (IR), or some other type of sensor.
In some embodiments, the touch-sensitive screen may be a capacitive touchscreen. Capacitive touchscreens, in particular projected capacitive touch (PCT) screens, may use the conductive and dielectric properties of a finger, stylus or other object along with arrays of transparent conductive electrodes and associated circuitry to determine the position and movement of object 104 (e.g., one or more of a user's finger or stylus) across the screen. As such, touch system 112 may use capacitive sensing technology to detect the location of object 104 by measuring small currents or displaced charges as a finger or other object 104 traverses and distorts electric field lines between adjacent or overlapping conductive traces of the capacitive touchscreen. Capacitive touchscreens typically operate at low power and are an available feature in many mobile devices. Touch system 112 may be embedded, embodied, attached or otherwise incorporated into mobile device 102. Touch system 112 may have lower resolution than sensor system 114 and may be incapable of resolving certain details of object 104.
Sensor system 114 may include a sensor array 116 and one or more processing components 132. Sensor array 116 may be coupled to the touch-sensitive screen and may reside underneath at least a portion of the display or the entire display, and/or may be integrated and built into the display of mobile device 102. In some implementations, sensor array 116 may be coupled to the touch-sensitive screen with a coupling material such as an epoxy, a pressure-sensitive adhesive (PSA), or other adhesive material. In some implementations, sensor array 116 may be laminated or otherwise bonded to the backside of the touch-sensitive screen or to the backside of the visual display. In some implementations, sensor array 116 may be fabricated or otherwise formed behind or as part of the visual display, touch-sensitive screen, or cover glass that may reside in front of the display. In some implementations, the sensor array may overlap some or all of the display and/or touchscreen.
Sensor array 116 may include one or more transmitters for transmitting signals and one or more receivers for picking up or receiving signals transmitted by the transmitters. Sensor array 116 may be, for example, an ultrasonic sensor array, capacitive sensor array, optical sensor array, radio frequency (RF) sensor array, infrared (IR) sensor array, force-sensitive resistor (FSR) array, or other type of sensor array. The quantity of receivers and transmitters (not illustrated) included in sensor array 116 may depend on its size. For example, sensor array 116 may be approximately 3.2 inches×3.0 inches, 1 inch×1 inch, 11 millimeters (mm)×11 mm, 15 mm×6 mm, 9 mm×4 mm, or 4 mm×4 mm. These are merely examples, and the size of sensor array 116 may vary.
In some embodiments, the transmitters may transmit a signal pattern of ultrasonic waves, and object 104 may be within a proximity of or may be positioned on or over a surface of the touch-sensitive screen, causing the ultrasonic waves to reflect back toward the sensor array. In an example, the transmitters transmit an ultrasonic signal. In this example, the transmitters may be any suitable ultrasonic device that includes one or more ultrasonic transducers such as a piezoelectric ultrasonic plane wave generator to generate ultrasonic signals. The receivers may receive the reflected signal pattern from object 104 and may be any suitable ultrasonic receiver. The receivers may run continuously such that they are always ready to receive reflections of the transmitted signals when mobile device 102 is turned on.
Processing component 132 may perform various operations of activating and accessing the sensor array and determining a position of object 104 based on the reflected signal pattern. Processing component 132 may extract ultrasonic signals received, detected and captured by the receivers of sensor array 116 and track the movement of object 104 to detect relatively accurate positions of object 104. Processing component 132 may capture or extract images (e.g., images based on ultrasonic, capacitive, optical, RF, IR, or FSR technologies) of the object.
Although sensor system 114 may be described as an ultrasonic sensor system and processing component 132 may be described as processing ultrasonic signals and capturing an ultrasonic image, this is not intended to be limiting. Sensor array 116 may be, for example, a capacitive sensor array, optical sensor array, radio frequency (RF) sensor array, infrared sensor array, force-sensitive resistor array, or other type of sensor array.
In some embodiments, processing component 132 may extract ultrasonic signals detected by the ultrasonic sensor array and determine whether to store the one or more ultrasonic images (which may include fingerprint, blood vessel structure, sweat pore details, etc.) of the object. Processing component 132 may store the ultrasonic image if it meets a certain quality threshold, for example if it is clear, and discard it if it is unclear. In some embodiments, processing component 132 may include one or more processors, central processing units (CPUs), image signal processors (ISPs), micro-controllers, digital signal processors (DSPs), graphics processing units (GPUs), or audio signal processors, which may include analog and/or digital audio signal processors. Sensor system 114 may be embedded, embodied, attached, or otherwise incorporated into mobile device 102. In some implementations, sensor array 116 of sensor system 114 may be positioned underneath, incorporated into, or otherwise included with the touch-sensitive screen or the visual display of mobile device 102. In some implementations, the touch-sensitive screen may be positioned above, incorporated into, or otherwise included with the display of mobile device 102.
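A minimal sketch of such a clarity gate follows; the gradient-based sharpness measure and the threshold value are illustrative assumptions rather than details taken from this disclosure.

```python
import numpy as np

# Illustrative clarity threshold; a real value would be tuned per sensor.
SHARPNESS_THRESHOLD = 0.15

def image_sharpness(image: np.ndarray) -> float:
    """Estimate image clarity as the mean squared gradient magnitude,
    normalized by the image variance."""
    gy, gx = np.gradient(image.astype(float))
    return float(np.mean(gx ** 2 + gy ** 2) / (np.var(image) + 1e-9))

def store_if_clear(image: np.ndarray, storage: list) -> bool:
    """Store a captured ultrasonic image only if it meets the clarity
    threshold; discard it otherwise."""
    if image_sharpness(image) >= SHARPNESS_THRESHOLD:
        storage.append(image)
        return True
    return False
```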
Mobile device 102 also includes a memory 134. Memory 134 may include a system memory component, which may correspond to random access memory (RAM), an internal memory component, which may correspond to read only memory (ROM), and/or an external or static memory, which may correspond to optical, magnetic, or solid-state memories, for example. Memory 134 may correspond to a non-transitory machine-readable medium that includes, for example, floppy disk, flexible disk, hard disk, magnetic tape, any other magnetic medium, CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, RAM, PROM, EPROM, FLASH-EPROM, any other memory chip or cartridge, and/or any other medium from which processing component 132 is capable of reading.
A user may interact with touch system 112 by touching the touch-sensitive screen, which detects the position of object 104 on the surface of the touch-sensitive screen. When object 104 is positioned on or over sensor array 116, the transmitter (e.g., ultrasonic transmitter) included in sensor array 116 may be fired to acquire one or more images (e.g., ultrasonic images) of the object. The user may interact with touch system 112 before interacting with sensor system 114.
The display is fingerprint-detection enabled in the portion in which sensor array 116 overlaps with the touch-sensitive screen while the remaining non-overlapping portions are not fingerprint-detection enabled. Sensor system 114 may be referred to as a fingerprint sensor system, and sensor array 116 may be referred to as a fingerprint sensor.
In some implementations, the display may provide a prompt requesting that the user scan her finger in order to be authenticated. Mobile device 102 may allow the user access to applications and data stored in mobile device 102 if the user is authenticated, and prevent the user from accessing applications and data stored in mobile device 102 if the user is not authenticated.
The user may interact with the display by placing finger 204 on the surface of the touch-sensitive screen. At an action 210, touch system 112 may receive and process touch data including one or more touch parameters from the user's touch. Touch system 112 may pass one or more of the touch parameters to a fingerprint control block 220. The touch parameters may be used to derive how and when sensor system 114 should capture an image of the object.
Touch system 112 may derive touch parameters from the user's touch such as the touch size, area, location, x-y position, angle and orientation of the user's finger, movement and/or rate of movement of the user's finger, and the touch down time (e.g., the duration of time in which the user's finger is touching the touch-sensitive screen) of an object touching the touch-sensitive screen of the device. In one example, if the user's finger is still moving, it may be ineffective to capture an image of the user's fingerprint. A more ideal time to capture the image is when, for example, the user's finger is practically still and within a threshold proximity to the touch-sensitive screen (e.g., touching the touch-sensitive screen).
In another example, the touch parameters may be used to determine when a user's finger is positioned over the fingerprint sensor array 116 to allow timely capture and acquisition of fingerprint images and to prevent unnecessary firings or activations of the sensor array 116 when no finger is present, reducing overall power consumption by the mobile device 102. In another example, the touch parameters may be used to determine if the object is likely a finger, a palm or a stylus (e.g., based on the area or outline of the object against the touchscreen), and activate portions of the sensor array 116 accordingly. In another example, touch parameters that can detect multiple simultaneous touches (e.g., multi-touch capability) may be used to trigger portions of the sensor array 116 associated with the locations where multiple finger touches have been detected, to allow simultaneous fingerprinting of two, three, four or five fingers. In some implementations, the position and orientation of a finger may be detected by the touchscreen, and used to aid in enrollment and/or verification of the user with the fingerprint sensor system.
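The sketch below illustrates how touch parameters might gate firing of the sensor array; the TouchParams fields, the numeric thresholds, and the sensor_rect representation are assumptions made for illustration, not values specified by this disclosure.

```python
from dataclasses import dataclass

@dataclass
class TouchParams:
    x: float              # touch centroid (screen coordinates, mm)
    y: float
    area_mm2: float       # contact area against the touchscreen
    speed_mm_s: float     # movement rate of the touch
    duration_ms: float    # touch-down time

# Illustrative thresholds; real values would be tuned per device.
MAX_SPEED_MM_S = 2.0      # below this, the finger is "practically still"
MIN_FINGER_AREA = 20.0    # smaller contacts suggest a stylus tip
MAX_FINGER_AREA = 400.0   # larger contacts suggest a palm

def over_sensor(p: TouchParams, sensor_rect: tuple) -> bool:
    """True if the touch centroid lies within the sensor's active area."""
    x0, y0, x1, y1 = sensor_rect
    return x0 <= p.x <= x1 and y0 <= p.y <= y1

def touches_to_image(touches: list, sensor_rect: tuple) -> list:
    """Return the still, finger-sized touches over the sensor. With
    multi-touch data, each returned touch could trigger its own
    sub-region of the sensor array for multi-finger imaging."""
    return [p for p in touches
            if p.speed_mm_s < MAX_SPEED_MM_S
            and MIN_FINGER_AREA <= p.area_mm2 <= MAX_FINGER_AREA
            and over_sensor(p, sensor_rect)]
```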
Touch system 112 may also derive touch parameters from the movement of or data related to mobile device 102, such as the movement rate of the mobile device, touch signal level data (e.g., threshold or signal-to-noise ratio (SNR)), or grip detection. Grip detection may refer to data regarding the user's grip on the mobile device. For example, touch system 112 may be able to determine which hand the user is using to hold the mobile device (e.g., right or left hand), where the user is gripping the mobile device (e.g., top, bottom, left- and/or right-side), and which hand is touching the screen (e.g., the left or right hand is being used to select a particular number displayed on a keypad).
At an action 212, sensor system 114 may receive and process fingerprint data. In some embodiments, sensor system 114 may include fingerprint hardware and circuitry to process images captured from the ultrasonic waves reflected from the user's finger. Sensor system 114 may pass the fingerprint data to fingerprint control block 220. When finger 204 is positioned over sensor array 116, an ultrasonic transmitter included in sensor array 116 may be fired to acquire one or more ultrasonic images of finger 204.
At an action 214, sensor data and/or device status information may be retrieved and passed to fingerprint control block 220. Mobile device 102 may include device sensors such as a temperature sensor (e.g., for ambient temperature, phone temperature, and temperature of the sensor and/or touch-sensitive screen) and/or a humidity sensor that provide the sensor data to allow for sensor compensation.
Mobile device 102 may also include one or more gyroscopes and accelerometers and/or a global positioning system (GPS) to determine, for example, whether the user is moving or how fast the user is moving. For example, if the user is running, it may be ineffective to capture an image of the user's fingerprint. In such an example, mobile device 102 may allow the user to unlock mobile device 102 but not allow the user to make a purchase online because the purchase may be unintentional on the user's part. A more ideal time to capture an image of the user's fingerprint is when, for example, the user's finger is practically still and within a threshold proximity to the touch-sensitive screen. Additionally, device status information such as cellular status, RF signal level (e.g., strong, medium, or weak signal), and/or battery level may be retrieved and passed to fingerprint control block 220.
D. Process Touch Data, Fingerprint Data, Sensor Data and/or Device Status Information
The touch data (see action 210), fingerprint data (see action 212), and sensor data and/or device status information (see action 214) may provide context into the user's movements and assist sensor system 114 in determining whether it is a suitable time to capture an image of the user's fingerprint or whether it may be desirable to identify different parameters processed by sensor system 114 in order to capture a different image of the fingerprint or capture a fingerprint image at a later time.
In the fingerprint control block 220, data may be synthesized to provide context and insight in a variety of ways. At an action 222, parameters that may control fingerprint scanning may be derived. The touch data, fingerprint data, sensor data, and/or device status information may be synthesized in order to realize optimal fingerprint sensor operating parameters, and an output of action 222 may adjust the parameters of the fingerprint sensor system to obtain a high-quality fingerprint image. For example, at an action 224, optimal tuning parameters for sensor system 114 and the optimal time to activate the scanning hardware may be determined and passed to object detection system 110.
At an action 226, data may be synthesized to provide real-time feedback to the user and passed to object detection system 110. In an example, real-time feedback with one or more recommendations for finger positional adjustments may be provided to the user via the display. In particular, the real-time feedback may provide suggestions to users on how to adjust their finger(s) or hand(s) to increase the probability of obtaining a good image of their fingers. For example, sensor system 114 may be able to determine where the user is touching, which portions of the touchscreen or visual display to touch for capturing a clear fingerprint image, whether mobile device 102 is being jostled around too much, and/or whether a good fingerprint image has been acquired.
Object detection system 110 may provide, for example, visual, audible, and/or haptic feedback to the user via the system user interface. An example of visual feedback may be providing a visual symbol (e.g., a bounded box or an illustration of a fingerprint with the text “Touch Here” in close proximity to the bounded box or fingerprint illustration) on the display of mobile device 102 indicating where on the touch-sensitive screen the user should touch to enable the system to capture a good image of the user's fingerprint (see visual image 322).
Another example of visual feedback may be providing a prompt on the display of mobile device 102 indicating that an image of the user's fingerprint has been successfully captured, or has not been successfully captured because of, for example, the movement of mobile device 102 or excessive movement of a finger during an image acquisition event. Another example of visual feedback may be causing a green light-emitting diode (LED) coupled to mobile device 102 to be lit in order to indicate that the user's fingerprint has been successfully captured, and causing a red LED coupled to mobile device 102 to be lit in order to indicate that the user's fingerprint has not been successfully captured. In some implementations, audio feedback such as a tone, sound, or verbal response may be provided to the user. In some implementations, haptic feedback may be provided to the user, such as a buzz or click when a fingerprint image is being acquired or when enrollment, matching, or authentication has been successful.
At an action 228, parameters for enhanced record creation and management may be derived. In an example, when finger 204 touches the touch-sensitive screen, sensor array 116 may be an ultrasonic sensor array that is fired to scan finger 204. In this example, processing component 132 acquires one or more ultrasonic images of the user's fingerprint and may store the fingerprint images or associated fingerprint features in memory 134.
If processing component 132 is aware of which hand (e.g., the right or left hand), finger (e.g., the thumb, index, middle, ring or pinky finger), and/or part of the user's finger (e.g., the tip, middle, side or bottom) is being used to create the fingerprint image, processing component 132 may be able to more easily match the user's fingerprint image or features with another fingerprint image or features stored in memory 134.
It may be desirable to minimize the size of sensor array 116 to reduce costs. If sensor array 116 is small, only a small number of minutiae or features of the user's finger may be captured, and the determination of whether the user's fingerprint image matches another fingerprint image must be made based on a small amount of data. Accordingly, to obtain more accurate and quicker results, it may be helpful to know which part of the user's finger was captured and represented by the fingerprint image.
At an action 230, data may be synthesized to enhance fingerprint record management to, for example, create, match, and/or authenticate the user's fingerprint image. If the user is setting up her account, processing component 132 may capture an image of finger 204, analyze the fingerprint image for minutiae and/or features, and store the minutiae and/or features as fingerprint template information (also known as a fingerprint “template”) in memory 134. The image may be based on sensor array 116's interaction with the object. At a later point in time, if processing component 132 determines that the captured fingerprint image of finger 204 matches a stored fingerprint template in memory 134, processing component 132 may authenticate the user. In contrast, if processing component 132 determines that the captured fingerprint image of finger 204 does not match any of the stored templates in memory 134, processing component 132 may determine that the authentication failed and not authenticate the user.
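A high-level sketch of this enroll-then-match flow appears below; extract_features and match_score are hypothetical stand-ins for the unspecified minutiae extraction and matching algorithms, and the match threshold is an assumed value.

```python
MATCH_THRESHOLD = 0.8  # assumed score above which fingerprints match

def enroll(image, templates: list, extract_features) -> None:
    """Analyze a captured image for minutiae/features and store them
    as a fingerprint template."""
    templates.append(extract_features(image))

def authenticate(image, templates: list, extract_features,
                 match_score) -> bool:
    """Authenticate the user if the captured image matches any stored
    template; otherwise authentication fails."""
    features = extract_features(image)
    return any(match_score(features, template) >= MATCH_THRESHOLD
               for template in templates)
```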
Touch-sensitive screen 302 may be made of a glass material and include a silicon-on-glass component 306 that resides below the touch-sensitive screen. In some embodiments, a display driver, a PCT front end (analog front end, or AFE), and a fingerprint sensor AFE may be combined into silicon-on-glass component 306. Each AFE may include one or more analog-to-digital converters and associated timing circuitry for acquiring and converting data from the various sensors. As such, a high degree of interaction may occur between the display driver, touch AFE, and fingerprint sensor AFE without leaving silicon-on-glass component 306. Although touch-sensitive screen 302 is described as being made of a glass material, it should be understood that touch system 112 may be made of any transparent material. For example, touch-sensitive screen 302 may be made of a polycarbonate or sapphire material.
A two-sided flex circuit 308 may be positioned between touch-sensitive screen 302 and sensor array 116. Sensor array 116 may reside on the flex, on the glass, or be otherwise coupled to the touchscreen. Two-sided flex circuit 308 is an electrical and physical interface, and may include two or more conductive layers with insulating layers between them. The outer conductive layers may have exposed pads that may be accessed from one or both sides of the flex. Sensor array 116 may use electrical connections on flex circuit 308 to receive power and provide fingerprint sensor data. Similarly, touch-sensitive screen 302 may use electrical connections on flex circuit 308 to generate and receive electrical signals for detecting one or more touches on the screen surface. Similarly, display 304 may use electrical connections on flex circuit 308 to receive power and display information flowing from a main printed circuit board (PCB) 320.
In one example, two-sided flex circuit 308 may feed electrical connections upward to display 304 via flex portion 308A and downward to sensor array 116 via flex portion 308B, so that data can be passed to and/or received from each other and main PCB 320. In another example, two-sided flex circuit 308 may feed electrical connections upward to display 304 via flex portion 308B and downward to sensor array 116 via flex portion 308A, so that data can be passed to and/or received from each other and main PCB 320. Additionally, plated-through holes in two-sided flex circuit 308 may provide connections between touch-sensitive screen 302 and sensor array 116.
Two-sided flex circuit 308 may include one or more silicon-on-flex components 310 that reside on one or both sides of flex circuit 308. In some embodiments, a fingerprint transmitter (Tx) driver section and one or more fingerprint/touch low-dropout (LDO) voltage regulators may be combined into silicon-on-flex component 310. The power supplies for the silicon-on-flex component 310 may be shared with other integrated circuits (ICs).
As discussed above, a visual prompt requesting the user to scan her finger for authentication may appear on display 304. The prompt may be accompanied by one or more audio tones, verbal commands, or haptic feedback to augment or validate the request. Main PCB 320 may operate and send data to display 304 and provide a visual image 322 to show the user where to place her finger on a surface of touch-sensitive screen 302 or display 304. Visual image 322 is shown as a dashed box and may indicate the outline of the active area of sensor array 116 coupled below it.
Main PCB 320 may include processing component 132 having a chipset 324 with one or more mobile station modems (MSMs) and one or more codecs (coder-decoders) 326. Chipset 324 with one or more digital processors may perform fingerprint processing and/or control the display interface. Chipset 324 may perform action 214 described above.
When touch system 112 is active, the user may place her finger on touch-sensitive screen 302 within the bounded dashed box of visual image 322 such that sensor system 114 is fired and processing component 132 captures an image of the user's fingerprint 332. Touch system 112 may process the user's touch and send the touch data (see action 210).
One or more processors in chipset 324 may analyze touch data from touch-sensitive screen 302 and determine whether to capture an image of fingerprint 332. For example, if the user is moving and fingerprint 332 is not clear or is blurred, chipset 324 may determine not to capture an image of fingerprint 332. If chipset 324 determines that fingerprint 332 is a good candidate for image capture, chipset 324 may activate sensor array 116, which may then communicate with the fingerprint AFE to run a scan of fingerprint 332 with the touch parameter data and any adjustments. Sensor array 116 may scan fingerprint 332 for a predetermined period of time and send the fingerprint data to processing component 132 for processing. Processing component 132 may, for example, create a template or record of the fingerprint image for enrollment, or determine whether the template or fingerprint image matches any of the fingerprint images stored in memory 134.
The components residing in silicon-on-glass component 306 and silicon-on-flex component 310 may vary based on various implementations. In one example, the display driver and touch AFE may be combined into silicon-on-glass component 306, and fingerprint transmitter driver section, fingerprint/projected capacitive touch LDOs, and fingerprint AFE are combined into silicon-on-flex component 310. In another example, the display driver may be included in silicon-on-glass component 306, and the touch AFE, fingerprint transmitter driver section, fingerprint/projected capacitive touch LDOs, and fingerprint AFE are combined into silicon-on-flex component 310.
Additionally, more than one silicon component may reside on the glass of the touch-sensitive screen, and more than one silicon component may reside on the flex circuit. If the touch AFE and fingerprint AFE reside in the same chip, they may use the same serial peripheral interface (SPI). If the touch AFE and fingerprint AFE reside in different chips, they may each use their own SPI. Using two SPIs may make it more challenging and expensive to run wires or other electrical connections between main PCB 320 and two-sided flex circuit 308, yet may provide more flexibility.
In one example, the display driver may reside in silicon-on-glass component 306, and two different silicon components reside on the two-sided flex circuit 308. In another example, fingerprint transmitter driver section and fingerprint/touch LDOs may reside in a first silicon-on-flex component (not shown), and touch AFE and fingerprint AFE are combined into a second silicon-on-flex component (not shown). In another example, fingerprint transmitter driver section and fingerprint/projected capacitive touch LDOs reside in a first silicon-on-flex component, and touch AFE resides in a second silicon-on-flex component (not shown).
Method 600 includes blocks 602-604. In a block 602, signals reflected from an object with respect to a touch-sensitive screen of a device may be detected by a sensor array coupled to the touch-sensitive screen. In an example, sensor array 116 is an ultrasonic sensor array that detects ultrasonic signals reflected from object 104 with respect to the touch-sensitive screen of mobile device 102.
In a block 604, one or more images of the object may be captured based on the reflected signals, where at least a portion of the sensor array overlaps with at least a portion of the touch-sensitive screen. In an example, sensor array 116 is an ultrasonic sensor array and processing component 132 captures, based on reflected ultrasonic signals, one or more ultrasonic images of object 104, where at least a portion of the ultrasonic sensor array overlaps with at least a portion of the touch-sensitive screen.
It is also understood that additional processes may be performed before, during, or after blocks 602-604 discussed above. It is also understood that one or more of the blocks of method 600 described herein may be omitted, combined, or performed in a different sequence as desired. In an embodiment, blocks 602-604 may be performed for any number of objects hovering over or positioned on a surface of the touch-sensitive screen (e.g., multiple fingers).
In some embodiments, processing component 132 may turn sensor array 116 off or place sensor array 116 into a low-power mode such that the transmitters do not transmit signals and/or the receivers do not capture or otherwise process received signals. For example, processing component 132 may refrain from sending a transmitter enable signal or drive voltage to an ultrasonic sensor array 116, preventing ultrasonic waves from being generated until needed for ultrasonic imaging. Similarly, processing component 132 may place a receiver portion of the sensor array 116 into a low-power or sleep mode until an image is needed, reducing overall power consumption. Processing component 132 may also be placed in a sleep mode for a period of time. The touch-sensitive screen is typically active and used to track the movement of the object. In particular, the touch-sensitive screen may be used to determine when finger 204 has stopped moving and when finger 204 is positioned over an active area of sensor array 116. Acquired fingerprint images may be clearer and more precise if the user has stopped movement of finger 204 when images of the finger are captured.
Touch sensors and touch-sensitive screens may consume less power than sensor system 114, so that in some implementations, a touch sensor or touch-sensitive screen may be used to detect a finger or other object, which may in turn trigger processing component 132 to wake up the fingerprint sensor. For example, if the detected area and/or outline of an object positioned on the touch-sensitive screen is similar in size and shape to that of a finger rather than a stylus or the inside of a protective case, then processing component 132 may wake up and invoke sensor system 114 and sensor array 116. Further, if coordinates of the finger or finger outline are within an image capture area (active area) of the sensor array 116, then one or more sensor images of the finger may be captured.
Touch system 112 may detect finger 204 on the surface of the touch-sensitive screen and send a signal to processing component 132 regarding the detected finger. If portions of processing component 132 are asleep, needed portions may be woken up or taken out of a sleep mode. Processing component 132 may detect when finger 204 is within the fingerprint sensor area. In response to detecting that finger 204 is within the fingerprint sensor area, processing component 132 may turn sensor array 116 on such that one or more transmitters fire off signals (e.g., ultrasonic signals), and one or more receivers receive the reflected signal patterns from finger 204. An appropriate time to turn sensor array 116 on might be when the finger has stopped moving and has settled into a relatively stationary position that is above an active area of the fingerprint sensor array. Processing component 132 may then acquire and process the reflected signal patterns to capture one or more images of the finger.
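The wake-up sequence described above might be sketched as follows; the touchscreen, sensor, and processor objects are hypothetical interfaces, and the settle time and polling interval are assumed values.

```python
import time

SETTLE_MS = 100   # assumed time the finger must be still before firing
POLL_S = 0.01     # low-power touchscreen polling interval

def capture_when_settled(touchscreen, sensor, processor):
    """Keep the sensor array asleep until the touchscreen reports a
    stationary finger over the sensor's active area, then fire it."""
    still_since = None
    while True:
        touch = touchscreen.read()               # low-power touch scan
        if touch is None or not sensor.contains(touch.x, touch.y):
            still_since = None                   # no finger over sensor
        elif touch.moving:
            still_since = None                   # wait for finger to settle
        elif still_since is None:
            still_since = time.monotonic()
        elif (time.monotonic() - still_since) * 1000 >= SETTLE_MS:
            sensor.wake()                        # enable Tx drive / Rx path
            image = sensor.capture()             # fire and receive reflections
            sensor.sleep()                       # return to low-power mode
            return processor.process(image)
        time.sleep(POLL_S)
```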
In one example, mobile device 102 may be asleep with the touchscreen and display off. A user may place a finger on a designated area such as a home button to wake up mobile device 102. If the touch-sensitive screen includes one or more capacitive touch sensors near a periphery of the screen, the capacitive touch sensors may be used to sense only a portion of the touch-sensitive screen (e.g., region around and including sensor array 116) at a low repetition rate to save power. In another example, only validated touches in a designated area of a touch-sensitive screen where a fingerprint sensor array is located may be acted upon when the touchscreen and display are in a low-power mode (e.g., the touch-sensitive screen wakes up and switches to a higher scan rate of capacitive sense channels (rows and columns) only in the designated area, wakes up the fingerprint sensor array, and initiates an image acquisition cycle). Other touches may be ignored because the capacitive sense channels outside the designated area may be off or the touches are made in a region outside the designated sensor area.
In some implementations, sensor system 114 may switch between a capture mode and a non-capture mode. If sensor system 114 is in the capture mode, sensor system 114 is active and captures an image of the fingerprint when it is within a threshold proximity to the touch-sensitive screen. If sensor system 114 is in the non-capture mode, sensor system 114 is active and does not capture an image of the fingerprint (even if it is within the threshold proximity to the touch-sensitive screen).
In some implementations, if a user touches the fingerprint area on the display with a gloved hand, the touch may be registered, but a preset threshold may indicate that the magnitude of the touch signal is too low for fingerprinting yet adequate for screen navigation. In such a scenario, touch system 112 may not engage the fingerprint sensor, reducing power consumption by not engaging the fingerprint sensor on every touch. In such an example, touches by a gloved finger may still meet a minimum signal threshold that allows for screen navigation.
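A minimal sketch of this two-threshold policy follows; the signal units and threshold values are arbitrary assumptions for illustration.

```python
# Assumed thresholds in arbitrary touch-signal units.
NAV_THRESHOLD = 10.0          # minimum signal for screen navigation
FINGERPRINT_THRESHOLD = 40.0  # minimum signal to engage the sensor

def handle_touch(signal_level: float) -> str:
    """A gloved finger may clear the navigation threshold but not the
    fingerprinting threshold, so the fingerprint sensor stays off."""
    if signal_level >= FINGERPRINT_THRESHOLD:
        return "navigate_and_fingerprint"
    if signal_level >= NAV_THRESHOLD:
        return "navigate_only"
    return "ignore"
```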
Accordingly, by leaving sensor array 116 off or in a low-power sleep mode when sensor images of an object are likely to be unclear, and turning sensor array 116 on when sensor images of the object are likely to be clear, object detection system 110 may enable better power management of sensor system 114. In this way, the power demand of sensor system 114 may be reduced and a higher probability of first-time scanning success achieved.
Sensor array 116 may be placed underneath a portion of the touch-sensitive screen (rather than underneath the entire touch-sensitive screen). With sensor array 116 coupled to the touch-sensitive screen, the size of sensor array 116 may be small while still providing for clear fingerprint imaging and other advantages, as discussed in the present disclosure. In an example, the active area of sensor array 116 may be on the order of ¼ inch×½ inch. The touch-sensitive screen (e.g., a capacitive touchscreen) aids effective operation of a reduced-size fingerprint sensor. For example, the touch-sensitive screen may detect the position of finger 204 on the surface of the touch-sensitive screen and aid in determining when the finger is positioned over the reduced-size fingerprint sensor.
The user may touch the active fingerprint area on the display. In an example, the user interface (UI) may display an outline of the touch area above the fingerprint sensor and provide a graphical guide for the user to position her finger in the correct location and/or orientation (e.g., see visual image 322).
As part of the interaction between a reduced-size fingerprint sensor and a display with a touch-sensitive screen, it may be important to know the physical location of sensor array 116 with respect to the display. For example, it may be important to allow an application that is executing on mobile device 102 to know the location of sensor array 116 with respect to the display.
With placement variations and assembly tolerances between various mobile device models, it may be desirable to determine the location of sensor array 116 within each device. Some software applications running on the mobile device may request an authenticating touch input from the user. If the application does not know the location of sensor array 116, the application may be unable to acquire images from sensor array 116. In another example, if the application has incorrect information about the position of sensor array 116 in mobile device 102, the application may not be able to receive the user's touch input and may inadvertently hang or become unresponsive.
In some embodiments, electromagnetic or electrostatic interactions between sensor array 116 and the touch-sensitive screen may be used to self-calibrate the sensor position and/or orientation after the sensor is attached to the display. For example, a transmitter or receiver electrode associated with sensor array 116 may be biased temporarily with an AC or DC signal to allow detection of the sensor by the touch-sensitive screen. The outline of the active portion of the sensor array may be used to determine the physical placement of the sensor. A software application may be able to run a routine to determine the location of sensor array 116 and to self-calibrate the touch-sensitive screen to the smaller sensor array 116.
In an example, sensor array 116 may be attached to the backside of a display and a touch-sensitive screen (e.g., projected capacitive touchscreen (PCT)) placed over and adhered to the display. To automatically determine the position of sensor array 116 with respect to the touch-sensitive screen, a bias voltage may be applied to one or more of the receiver (e.g., ultrasonic receiver) or transmitter (e.g., ultrasonic transmitter) electrodes. The bias voltage may be applied to the receiver or transmitter electrode closest to the touch-sensitive screen. One or more electrodes of sensor array 116 may be biased or injected with a time-varying signal that can be detected by the overlying capacitive touchscreen to verify aspects of sensor operation (during a sensor self-test procedure).
A scan of the touch-sensitive screen may be performed, and the active region of the sensor determined. Coordinates of the active sensor region may be determined and stored in memory 134 (e.g., areal calibration). The size of the active sensor area may also be stored in memory 134. Accordingly, the size, position, and orientation of sensor array 116 may be determined with respect to a capacitive touchscreen and stored in memory 134.
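One plausible implementation of this self-calibration scan is sketched below; it assumes the temporarily biased sensor electrodes show up as elevated capacitance deltas in a two-dimensional touch frame, which is an assumption of the sketch rather than a specification of the hardware.

```python
import numpy as np

def calibrate_sensor_region(touch_frame: np.ndarray,
                            threshold: float) -> tuple:
    """Locate the biased sensor electrodes in a frame of per-node
    capacitance deltas from the touch-sensitive screen, and return the
    bounding box (row0, col0, row1, col1) of the active sensor region
    in touchscreen coordinates."""
    rows, cols = np.where(touch_frame > threshold)
    if rows.size == 0:
        raise RuntimeError("sensor outline not detected")
    return (int(rows.min()), int(cols.min()),
            int(rows.max()), int(cols.max()))

# The returned coordinates, and hence the size and position of the
# sensor, would then be stored in memory for later use.
```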
Software applications running on mobile device 102 may invoke the size, position, and/or orientation parameters to guide the user to a position on the screen where images of the user's fingerprint may be captured. A virtual image may provide an example outline to the user of where to place her finger on the touch-sensitive screen (see visual image 322).
During enrollment using small-area fingerprint sensors, multiple touches/taps may be requested for registering each desired finger. This may adversely affect the user's experience (e.g., excessive latency and repetitive touches/taps) and may demand excessive computation. For example, the process of requesting that the user tap multiple times to record or register a fingerprint during enrollment may take up to 15 seconds or longer. Additionally, matching or authentication of a user can consume extensive compute resources and cause significant latency depending on the number of enrolled fingers or users. For example, the latency and processing for enrollment may take up to approximately 500 milliseconds for each enrollment image. The processing time grows linearly with the number of fingers and users, thus degrading the user's experience.
To reduce the amount of time to process a fingerprint image, the touch-sensitive screen may be used to detect an approximate finger outline and the region where the user touched.
Sensor array 116 may see only a portion of the user's fingerprint, such as when sensor array 116 has an active area that is smaller than the fingerprint, or when only a portion of the finger overlaps the active area of the sensor.
As discussed in more detail below, the detected outline may be matched against a template as a first level of authentication and further used for selectively activating high-resolution fingerprint imaging. In some implementations, the outline of one or more fingers may be used as a primary or secondary biometric feature in a large-area multi-finger fingerprint sensor. For example, the finger outline result may further trigger a secondary authentication (e.g., ultrasonic fingerprint imaging) and/or biometric enhancement (e.g., liveness detection). Liveness detection detects physiological signs of life. Additionally, the finger outline may be used for enabling localized high resolution and/or insonification of finger positions on a large screen sensor, thus reducing power consumption and processor utilization. Insonification may refer to flooding an area or an object with controlled sound waves, typically as a part of sonar or ultrasound imaging. Accordingly, a multi-level authentication system can be performed with low latency, low processor utilization, and low power consumption.
The position and area of sensor array 116 may be associated with the finger outline to estimate the fingerprint contact area and position of the finger. In some embodiments, the finger outline may be used for template association. By using additional finger outline and/or rotation information from touch system 112, processing of fingerprint image data from sensor system 114 (e.g., rotation, position, and/or area adjustments) may be accelerated.
Conventional touch-sensitive screens may image at about 10-20 dots per inch (dpi) whereas fingerprint sensors may image at about 500 dpi. In some embodiments, the touchscreen sensor may be used for determining a finger outline, which may be used to estimate finger rotation and positioning relative to sensor array 116. In an example, the outline and finger rotation/position information may be used for image template or feature template stitching in small-area sensor-based fingerprint enrollment procedures. Stored minutiae and/or fingerprint features from a single finger or a part of the finger may be referred to as a feature template, whereas detailed images of fingerprint ridges, valleys and other features may be referred to as an image template. The fingerprint capture area may be associated with a portion or area of the user's finger, palm or hand, in some cases.
Multiple enrollment images from a single finger may be stitched and/or stored to represent the finger. This representation of the finger may be called an image template. For example, touch system 112 may detect a user's fingertip outline.
In an example, the touch system may determine an object outline based on detecting the object's touch on a surface of the touch-sensitive screen and determine a rotation of the object from the object outline. In such an example, processing component 132 may identify an image template based on the object outline and rotation, stitch together one or more images of the object with the identified image template, and form a new or updated image template based on the stitching. The image template may be a partial or full image of the object.
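A simplified sketch of outline-guided stitching follows; it assumes SciPy is available for rotation, and the angle and offset inputs stand in for values derived from the touch-detected outline. A production implementation would likely refine this initial alignment with image registration before blending.

```python
import numpy as np
from scipy.ndimage import rotate  # assumes SciPy is available

def stitch_partial(template: np.ndarray, partial: np.ndarray,
                   angle_deg: float, offset: tuple) -> np.ndarray:
    """Place a rotated partial fingerprint image into the template
    canvas at the outline-derived position, keeping the stronger
    response wherever the two overlap."""
    aligned = rotate(partial.astype(float), -angle_deg,
                     reshape=False, order=1)
    out = template.astype(float).copy()
    r, c = offset
    h = min(aligned.shape[0], out.shape[0] - r)
    w = min(aligned.shape[1], out.shape[1] - c)
    out[r:r + h, c:c + w] = np.maximum(out[r:r + h, c:c + w],
                                       aligned[:h, :w])
    return out
```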
In some embodiments, features from the fingerprint images may be extracted and associated feature descriptors may be stored as a representation of the finger. This representation of the finger may be called a feature template. In an example, the touch system may create or determine an object outline based on detecting the object's touch on a surface of the touch-sensitive screen and determine a rotation of the object from the object outline. In such an example, processing component 132 may extract one or more features from the one or more captured images, identify a feature template based on the object outline, rotation and/or extracted features, and stitch one or more images of the object to form or enhance the feature template. Processing component 132 may then create or determine a new feature template based on the stitching. The feature template may be a partial or full image of the object.
The template (e.g., image template or feature template) may be annotated with the finger outline, rotation, and position information to aid in future inquiries. During enrollment, the template(s) of the user's fingers(s) may be stored in a device-secure flash memory (e.g., a secure memory in a phone). In some embodiments, storing the template of the user's finger may be a one-time process. In some embodiments, the template of the user's finger may be updated during inquiries. Additionally, multiple-finger templates of one or more users may be stored in the device. When the user invokes the fingerprint authentication system (e.g., attempts to unlock the device), features of the current inquiry image may be matched with the templates and a match score may be computed. Based on the match score, the user may or may not be authenticated.
Additionally, user feedback regarding the enrollment process may be enhanced with the knowledge of rotation and position, thereby improving the user's experience (e.g., reducing the number of required touches and overall latency) and processor utilization. For example, the touchscreen-detected parameters may be used for enhancing the user's experience by providing useful feedback such as guiding the user and informing the user of progress (see action 226).
In some embodiments, fingerprint template matching may be performed by matching outlines only. In an example, matching includes a correlation of two outline/silhouette images of an object (e.g., the user's finger, set of fingers, palm or hand). In another example, machine learning may be used to determine if the inquiry outline matches with the enrollment (template) outline. In such an example, the machine learning may be used for localizing templates for fingerprint matching purposes.
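A minimal stand-in for such outline correlation is an intersection-over-union score of two binary silhouette masks, sketched below; the masks are assumed to be pre-aligned and of equal size, which is an assumption of the sketch.

```python
import numpy as np

def outline_match_score(inquiry: np.ndarray,
                        template: np.ndarray) -> float:
    """Score the overlap of two binary outline/silhouette masks as
    intersection-over-union; 1.0 means identical outlines."""
    a = inquiry.astype(bool)
    b = template.astype(bool)
    union = np.logical_or(a, b).sum()
    return float(np.logical_and(a, b).sum() / union) if union else 0.0
```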
Further, the position and rotation for matching may be refined based on estimated parameters during the inquiry. As such, inquiry fingerprint features may be matched against selected or ordered templates. Upon a successful match, an early exit may occur, thus reducing authentication latency and minimizing hardware resource utilization. In large-area sensors, a low-resolution touch sensor may be used for detecting an initial touch and determining an outline of a finger, hand or palm, followed by selective image acquisition and image processing in the regions of interest with a high-resolution sensor array.
Method 1000 includes blocks 1002-1022. In a block 1002, an overlap between sensor array 116 and portions or all of a touch sensor may be detected. In an example, sensor array 116 may be an ultrasonic fingerprint sensor, and the touch sensor may be one or more capacitive sense channels incorporated into a touch-sensitive screen. The fingerprint sensor location may be detected and stored in memory 134. Block 1002 may be performed once per device.
If the fingerprint enrollment is finished, the process flow may proceed to a block 1004, in which the process may end. If the fingerprint enrollment is not finished, the process flow may proceed to a block 1006, in which a finger outline may be detected using the touch sensor. In a block 1008, one or more rotation angles may be determined based on the finger outline. In an example, processing component 132 may analyze the shape and area of the finger outline to determine the one or more rotation angles. The rotation angles may include, for example, an in-plane rotation about a yaw axis, an out-of-plane rotation about a pitch axis, and/or a rotation about a roll axis.
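One common way to estimate the in-plane rotation angle from a binary outline mask is a principal-axis fit, sketched below; the disclosure does not name a particular method, so this technique is an illustrative assumption.

```python
import numpy as np

def in_plane_rotation_deg(outline_mask: np.ndarray) -> float:
    """Estimate the in-plane finger rotation from a binary outline
    mask as the angle of the touched region's principal axis."""
    ys, xs = np.nonzero(outline_mask)
    if xs.size < 2:
        return 0.0  # not enough contact points to estimate an angle
    pts = np.stack([xs - xs.mean(), ys - ys.mean()])
    eigvals, eigvecs = np.linalg.eigh(np.cov(pts))
    major = eigvecs[:, np.argmax(eigvals)]  # dominant elongation direction
    return float(np.degrees(np.arctan2(major[1], major[0])))
```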
In a block 1010, the finger outline may be mapped to sensor array 116. In an example, signals from the touch sensor allow identification of coordinates for the finger outline, and processing component 132 may detect the overlap between the finger outline and the active area of sensor array 116. Block 1010 may hold or store in memory fingerprint sensor and touch sensor position information, such as fingerprint sensor coordinates and touch sensor coordinates. From block 1002, the fingerprint sensor position may be determined with respect to the touch sensor coordinates. Accordingly, the touch-derived outline and contact area of the finger may be translated to fingerprint sensor parameters and mapped onto sensor array 116.
In a block 1012, the finger and sensor array contact area may be detected. Processing component 132 may associate the capture area of the finger outline to an area of the finger that is captured in the one or more images (e.g., tip of finger). Processing component 132 may use the coordinates of the finger outline mapped to an area of sensor array 116 to detect the contact area. In a block 1014, a current image (e.g., ultrasonic image) of the finger may be captured. In an example, sensor array 116 may be an ultrasonic sensor array that is fired to acquire one or more ultrasonic images of the finger, and processing component 132 captures the one or more ultrasonic images of the finger.
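Mapping the outline onto the sensor and estimating the contact area might look like the following sketch; the outline is assumed to be a list of (x, y) cells in touch-sensor coordinates, and sensor_rect the calibrated sensor bounding box found in block 1002.

```python
def sensor_contact_fraction(outline_cells: list,
                            sensor_rect: tuple) -> float:
    """Fraction of the touch-derived finger outline that falls inside
    the sensor array's active area; both are expressed in touch-sensor
    coordinates, using the sensor position found during calibration."""
    if not outline_cells:
        return 0.0
    x0, y0, x1, y1 = sensor_rect
    inside = sum(1 for (x, y) in outline_cells
                 if x0 <= x <= x1 and y0 <= y <= y1)
    return inside / len(outline_cells)
```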
Blocks 1016 and 1020 have dashed lines, indicating that at most one of these blocks may be executed for each flow from block 1014 to block 1022 of method 1000. If block 1016 is executed then block 1020 is not executed, and the process flow proceeds from block 1016 to block 1018 and then to block 1022. In a block 1016, one or more features of the image are extracted, where the image is a partial enrollment image of the finger. An example of a fingerprint feature is a fingerprint minutia, and examples of image features are edges and corners in the fingerprint image. In an example, features may be described using a histogram of gradients or various statistical parameters of a local block around the image feature. The descriptors may then be matched by a matching algorithm.
In a block 1018, an image template and/or feature template may be stitched together with the current image of the finger. External data such as one or more stored image templates or feature templates may be used for stitching with the current image. One or more images or features of the finger (or other object) may be stitched together with the stored image or feature template, and a new or updated image or feature template may be created or formed based on the stitching.
The conversion from a small area image to a full size image or feature template may be performed in a variety of ways. In one example, small area images may be stitched together using image registration techniques and one or more features of the stitched image may be extracted. In another example, one or more features of partial images may be extracted and the templates annotated and stitched together to create, determine or form another feature or image template. In another example, the captured fingerprint image may be annotated with the position and rotation angle information. The position and rotation angle information may be used for stitching the image or labeling/stitching the image template. Additionally, the finger outline, position, and area information may be tagged to the templates to aid in fast matching/inquiry.
In some implementations, the feature or image templates may not be stitched together. Rather, the templates may be ordered or otherwise numbered and stored for future inquiry, matching, or authentication purposes. In some implementations, the touch sensor or touch-sensitive screen may aid when stitching templates, based on the known overlap area with respect to the fingerprint sensor.
From block 1018, process flow proceeds to block 1022, where guidance to perform an enrollment position/rotation change may be provided to the user (see action 226).
In contrast, if block 1020 is executed then block 1016 is not executed and process flow proceeds from block 1014 to blocks 1018-1022. In such an example, from block 1018, process flow proceeds to block 1020. In a block 1020, one or more features of the image may be extracted, where the image is a full enrollment image of the finger. The one or more features and/or images may be stored in memory. From block 1020, process flow proceeds to block 1022.
It is also understood that additional processes may be performed before, during, or after blocks 1002-1022 discussed above. It is also understood that one or more of the blocks of method 1000 described herein may be omitted, combined, or performed in a different sequence as desired. In an embodiment, method 1000 may be performed for any number of objects hovering over or in contact with a surface of the touch-sensitive screen (e.g., multiple fingers).
Method 1100 includes blocks 1102-1120. In a block 1102, an overlap between sensor array 116 and portions or all of a touch sensor, such as a touchscreen, may be detected. In an example, sensor array 116 may be an ultrasonic fingerprint sensor, and the touch sensor may be one or more capacitive touch sensors, capacitive touch buttons, or capacitive sense channels incorporated into a touch-sensitive screen. The fingerprint sensor location may be detected and stored in memory 134. In some implementations, x-y coordinates of each corner associated with the perimeter of the active area of sensor array 116 with respect to the touch sensor may be detected and stored in memory 134. Block 1102 may be performed once per device (e.g., for each sensor array 116 and/or for each touch sensor (i.e., the capacitive touchscreen and each capacitive touch button) in a mobile device 102).
From block 1102, the process flow may proceed to a block 1104, where an outline of a finger may be detected using the touch sensor. From block 1104, the process flow may proceed to a block 1106 and a block 1118, described further below. In a block 1106, the finger outline may be matched against one or more stored finger outline templates (e.g., an image template or feature template associated with the finger outline). In an example, processing component 132 may attempt to match the finger outline against one or more finger outline templates obtained during a prior enrollment process. In a block 1108, it may be determined whether the finger outline matches one or more of the stored finger outline templates. In an example, the finger outline may be matched with a registered fingerprint database that may include registered finger outlines. The finger outline templates corresponding to the matching finger outlines may be selected for further analysis, such as fingerprint analysis of an acquired fingerprint image from the user.
If the finger outline is determined to not match any of the finger outline templates, the process flow may proceed to a block 1120, in which the process may end. If the finger outline is determined to match one or more of the finger outline templates, the process flow may proceed to blocks 1110 and 1112. In a block 1110, finger rotations may be detected from the finger outline detected in block 1104. From block 1110, the process flow may proceed to a block 1116, in which the contact area between the finger and the sensor array may be detected. From block 1110, the process flow may also proceed to block 1118, which is described further below.
From blocks 1116 and 1112, the process flow may proceed to a block 1118, in which features such as finger rotation, relative position of one or more fingers, contact area of the finger, and/or a finger outline are matched to a corresponding outline template (see block 1106). The finger position, rotation, and sensor array contact area may be estimated from the finger outline and the sensor array's position relative to the finger outline. Using the estimated rotation, position, contact area, and/or finger identifier, the features obtained from the inquiry finger image may be matched against the corresponding outline template. As a fallback, other templates may be searched if the preliminarily identified outline-based templates fail to yield a match. Template information for one or more fingers of a single user or of multiple users may be stored in memory of a mobile device. Additionally, each finger may have subparts captured by the fingerprint sensor. Furthermore, the templates may be prioritized for search/matching based on an outline matching score to reduce latency.
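The outline-prioritized search with fallback might be sketched as follows; the matcher interfaces (`outline_score_fn`, `match_fn`, the `outline` attribute) and the threshold value are hypothetical stand-ins for routines the disclosure does not specify:

```python
def match_fingerprint(inquiry_features, inquiry_outline, templates,
                      outline_score_fn, match_fn, threshold=0.9):
    """Search enrollment templates in decreasing order of outline-matching
    score, so the most plausible candidates are tried first (reducing
    latency); weaker candidates are searched later, which serves as the
    fallback when the outline-based shortlist fails to yield a match."""
    ranked = sorted(templates,
                    key=lambda t: outline_score_fn(inquiry_outline, t.outline),
                    reverse=True)
    for template in ranked:
        if match_fn(inquiry_features, template) >= threshold:
            return template  # matched; use for authentication
    return None              # no template matched the inquiry image
```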
In some implementations, finger identification may be determined based on the finger outline or the finger area. Finger identification may include which finger of a hand is being presented, the relative position between the fingers, and/or the finger area (e.g., the size of a finger or the contact area between various joints of the finger or hand). Finger identification may help narrow fingerprint searching and matching. Alternatively, the finger outline may help identify or initially select which fingerprint image or feature templates to search. For example, the finger outline may be used to determine an offset angle between inquiry and enrollment images to aid in searching and matching. In some implementations, the finger outline or area may allow low-level verification. From block 1118, the process flow may proceed to block 1120, in which the process ends.
It is also understood that additional processes may be performed before, during, or after blocks 1102-1120 discussed above. It is also understood that one or more of the blocks of method 1100 described herein may be omitted, combined, or performed in a different sequence as desired. In an embodiment, method 1100 may be performed for any number of objects hovering over or in contact with a surface of the touch-sensitive screen (e.g., multiple fingers).
Although object 104 in
Referring again to
In response to determining that the object is a stylus, sensor system 114 may recognize that a stylus is touching or being detected by the screen and reconfigure the touch-sensitive screen for optimal sensing of the object. For example, main PCB 320 or a controller associated with the touchscreen may increase the sample rate, resulting in a higher dynamic resolution on the screen. That is, an increased sampling rate allows faster detection of and response to movements of an object such as a stylus on a surface of the touchscreen, increasing the speed at which the touch system can follow a rapidly moving stylus, finger or other object on the touchscreen.
The user may touch a portion of the touch-sensitive screen that overlaps with sensor array 116. In an example, a user may touch the overlapping portion (the portion of the touch-sensitive screen overlapping with sensor array 116) with a tip of a stylus, an image may be obtained of the stylus tip, the sensor system 114 may determine that the object is a stylus, and the touch-sensitive screen may be reconfigured to accommodate the stylus based on the stylus determination. For example, the sample rate, gain, touch thresholds, and filter settings associated with a stylus mode of the particular tip may be applied to the touchscreen via the touch system 112. The sample rate for the touchscreen may be increased by more rapidly accessing the various rows and columns of the touch-sensitive screen, allowing faster acquisition of data and the ability to track a quickly moving stylus across the surface of the touchscreen.
Alternatively, a limited portion of the rows and columns associated with the touchscreen may be accessed, allowing an increased frame rate in an area of interest (e.g., in the vicinity of the stylus tip). The gain associated with one or more channels of the touchscreen may be increased when the detected object is determined to be a stylus, as the area (and signal) associated with the tip of a stylus is generally much smaller than the area (and signal) of a finger touching the touchscreen. For example, an amplification factor may be increased for corresponding capacitive sense channels when attempting to detect the presence of a small-area object on or near the surface of the touchscreen.
Alternatively, a threshold for detecting an object may be lowered when the detection of a stylus tip is anticipated, compared to the threshold for detecting a larger object such as a finger, since the sensed signal for a small object is generally smaller than the sensed signal for a large object. Various filter settings (e.g., electronic filters or image-processing filters) may be adjusted to accommodate the detection of a stylus tip, such as a software filter that recognizes a small-area object. A low-pass spatial filter may be used, for example, when detecting the presence of a finger to reduce or eliminate the higher spatial frequencies associated with dust, small scratches or debris on the surface of the touchscreen. The touch system 112 may incorporate provisions for raising the roll-off frequency of the low-pass filter so that the spatial frequencies associated with a stylus tip can be detected.
Alternatively, a band-pass filter centered in a region of interest around the approximate spatial frequency of a stylus tip may be incorporated into the touch system 112. Similarly, a high-pass filter that passes the spatial frequencies associated with a stylus tip rather than the lower spatial frequencies associated with a finger may be incorporated into the touch system 112. The sample rate, gain, touch thresholds, and filter settings associated with a stylus mode may be further adjusted to accommodate a particular style of stylus tip.
In another example, the user may touch the overlapping portion with a blunt tip of a stylus. In such an example, the touch-sensitive screen may be reconfigured with the sample rate, gain, touch thresholds, and filter settings associated with a “blunt tip” stylus mode after detection of the stylus by the sensor array 116 and determination of the stylus characteristics by sensor system 114. A blunt tip may be, for example, a larger marker tip, a soft or compliant marker tip, or an angled rectangular marker tip. In another example, the user may touch the overlapping portion with a fine tip of a stylus. In such an example, the touch-sensitive screen may be reconfigured with the sample rate, gain, touch thresholds, and filter settings associated with a “fine tip” stylus mode after detection of the stylus by the sensor array 116 and determination of the stylus characteristics by sensor system 114. A fine tip may be, for example, a smaller marker tip or a small-radius tip of a ball-point pen or pencil.
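The per-tip reconfiguration described above could be modeled as a table of presets applied through a touch controller. The parameter names, the numeric values, and the controller `apply` interface below are assumptions for illustration only:

```python
# Illustrative touch-controller presets; names and values are assumptions,
# not settings taken from the disclosure.
STYLUS_MODES = {
    "finger":    {"sample_rate_hz": 120, "gain": 1.0, "touch_threshold": 40,
                  "spatial_filter": "low_pass"},
    "blunt_tip": {"sample_rate_hz": 240, "gain": 2.0, "touch_threshold": 25,
                  "spatial_filter": "band_pass"},
    "fine_tip":  {"sample_rate_hz": 360, "gain": 4.0, "touch_threshold": 10,
                  "spatial_filter": "high_pass"},
}

def reconfigure_touchscreen(touch_controller, detected_object: str) -> None:
    """Apply the sample rate, gain, threshold, and filter settings for the
    object classified by the sensor system (hypothetical controller API)."""
    settings = STYLUS_MODES.get(detected_object, STYLUS_MODES["finger"])
    touch_controller.apply(settings)
```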
In another example, a user may touch an overlapping portion of the touchscreen and sensor array with an object such as an acoustic information tag. The acoustic information tag may contain an acoustic signature or other acoustic identifier. For example, the acoustic tag may contain an acoustic version of a one-dimensional or two-dimensional barcode, such as a UPC bar code, a QR code, or other information-carrying code. Alternatively, the acoustic information tag may contain an acoustic identifier such as a personalized insignia, signature, mark, emblem or tattoo. For example, a set of detents or raised surfaces (e.g., embossments) on the acoustic tag may be detected with an underlying ultrasonic sensor. The raised regions, when in contact with the surface of the touchscreen, cover glass, cover lens or platen overlying the ultrasonic sensor, may pass or transmit more acoustic energy than the intervening regions of air or other acoustically mismatched material disposed between the raised regions. The acoustic information tag may be recognized by the mobile device 102. The tag may be detected by the touchscreen and then imaged by an underlying ultrasonic sensor array 116. The acoustic tag may enable an action to occur (e.g., providing a coupon, delivering an advertisement, tracking a piece of equipment, identifying a person, etc.). In such an example, processing component 132 may identify the acoustic information tag or acoustic identifier and cause the action to occur.
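Because raised regions of the tag transmit more acoustic energy and therefore image brighter than the air gaps between them, a one-dimensional tag could in principle be read out from the ultrasonic image by thresholding a column-averaged intensity profile. The function below is a sketch under that assumption; the cell layout and thresholding scheme are illustrative, not a format from the disclosure:

```python
import numpy as np

def decode_acoustic_tag(ultrasonic_image: np.ndarray, n_bits: int = 16) -> int:
    """Read a 1-D bar-style code from an ultrasonic image of an acoustic tag.

    Embossed regions couple more ultrasound and image brighter; intervening
    air gaps image darker. Assumes one code cell per equal-width column band.
    """
    profile = ultrasonic_image.mean(axis=0)          # average rows -> 1-D profile
    cells = np.array_split(profile, n_bits)          # one cell per bit
    threshold = profile.mean()                       # bright cell => bit 1
    bits = [int(cell.mean() > threshold) for cell in cells]
    return int("".join(map(str, bits)), 2)           # pack bits into a tag ID
```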
Additionally, mobile device 102 may have one or more touch buttons that are not located on, or part of, the touch-sensitive screen above the active area of the visual display. In an example, a touch button may be a capacitive-sense touch button including a capacitive electrode that is mounted or positioned outside the periphery of the active display area. In some embodiments, sensor array 116 may be located underneath one or more of the peripheral touch buttons. The touch button may be, for example, a home, menu or back button positioned at the bottom of mobile device 102. Processing component 132, which interacts with sensor system 114, may also manage the touch buttons such that the touch button feeds data to sensor array 116. For example, a capacitive touch button with an underlying ultrasonic sensor array 116 may use data from the capacitive touch button to determine when an object such as a finger is sufficiently over the sensor array 116, and then activate the sensor array 116 to acquire one or more images of the finger. In some implementations, the sensor array 116 may not be positioned directly underneath an active part of the display, yet may be peripheral to the display while still sharing a common cover glass.
When the user touches the touch button, the user may also be touching (or hovering over) sensor array 116, which is in close proximity to the touch button. In such an example, sensor array 116 may be fired to acquire one or more ultrasonic images of the object. For example, the touch button may perform action 210 (in
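The button-triggered firing of the sensor array might be sketched as an event handler such as the following; the `event`, `sensor_array`, and `on_image` interfaces are hypothetical driver APIs, and `SensorOverlap` refers to the illustrative class sketched earlier:

```python
def on_touch_button_event(event, sensor_array, overlap, on_image) -> None:
    """Fire the underlying sensor array when the capacitive touch button
    reports an object sufficiently over the sensor's active area."""
    if event.kind == "touch" and overlap.contains(event.x, event.y):
        on_image(sensor_array.acquire_image())  # e.g., one ultrasonic frame
```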
Method 1200 includes blocks 1202-1210. In a block 1202, an outline of an object may be detected by a touch sensor. In an example, the object is a finger of a user, and touch system 112 detects an outline of the user's finger on the surface of the touch-sensitive screen. Mobile device 701 may include a low-power touchscreen sensor that is used to detect an initial touch and finger outline.
In a block 1204, the outline of the object may be authenticated. In an example, output from a low- or intermediate-resolution capacitive sensor in the touch-sensitive screen may be used to authenticate the outline of the object. In an example, the object may be a hand, finger or palm, and the low- or intermediate-resolution authentication may use a 10-50 dpi touch-sensitive screen to authenticate the outline. If the outline of the object fails authentication, the process flow may proceed to a block 1210 in which the process ends. If the outline of the object is authenticated, the process flow may proceed to a block 1206, where an area and position of the object may be detected. In another embodiment, authentication is equivalent to detection, such that touch system 112 detects a finger without any specificity as to the identity of the user. In this embodiment, any object matching a profile (e.g., shape, aspect ratio, etc.) for a finger will be authenticated.
The single- or multi-finger/hand outline may be used to trigger a second authentication level using more detailed features of the object. For example, to avoid expending the power, time and computing resources generally needed for high-level authentication, a high-level authentication effort may be attempted only after an outline of a user's finger, palm or hand has passed an initial low-level authentication effort based on signals from the touchscreen. Alternatively, a low-resolution outline of a finger, palm or hand may provide the approximate location of the finger, palm or hand on the touchscreen for further high-resolution detection. In an example, the touch-sensitive screen may detect the area and position of multiple fingers simultaneously touching the surface of the touchscreen, as shown by imaging regions 702-710 in
From block 1206, the process flow may proceed to a block 1208, where high-resolution image capture is enabled and one or more selected object positions and areas on the touchscreen are insonified. For example, the finger outline may be used for selective insonification (e.g., emitting/capturing an ultrasonic fingerprint image over selective areas) corresponding to imaging regions 702-710 in
Accordingly, a two-level fingerprint authentication system may authenticate at a first authentication level via low- or intermediate-resolution capacitive sensing and at a second level via high-resolution ultrasonic fingerprint sensing. The first authentication level may use the finger outline to authenticate the fingerprint at a low level. The second authentication level may be “woken up” based on whether the fingerprint passed the first authentication level. In an example, the second authentication level is triggered only if the first authentication level indicates that enablement of high-resolution ultrasound-based liveness and/or fingerprint verification is required. Although ultrasonic technology has been used as an example, it should be understood that other technologies are within the scope of the disclosure (e.g., capacitive, optical, RF, IR, or FSR technologies) for liveness detection and fingerprint imaging.
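The two-level flow might be sketched as follows; the `touch` and `ultrasonic` objects and their methods are hypothetical stand-ins for the capacitive and ultrasonic subsystems, and the gating logic is an illustration of the wake-up behavior rather than the disclosed implementation:

```python
def two_level_authenticate(touch, ultrasonic, outline_templates) -> bool:
    """First level: low-resolution capacitive outline check. Second level:
    high-resolution ultrasonic fingerprint and liveness verification, woken
    up only when the first level passes."""
    outline = touch.read_outline()                 # ~10-50 dpi capacitive data
    if not any(outline.matches(t) for t in outline_templates):
        return False                               # fail fast; sensor stays asleep
    region = outline.bounding_region()             # restrict insonification area
    image = ultrasonic.capture(region)             # selective high-res capture
    return ultrasonic.verify(image) and ultrasonic.liveness(image)
```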
It is also understood that additional processes may be performed before, during, or after blocks 1202-1210 as discussed above. It is also understood that one or more of the blocks of method 1200 described herein may be omitted, combined, or performed in a different sequence as desired. In some embodiments, method 1200 may be performed for any number of objects hovering over or touching the touch-sensitive screen (e.g., multiple fingers). In some embodiments, the second authentication level may be used as a backup to the first authentication level. For example, if the first authentication level fails to acquire an adequate sensor image of the object, the second authentication level may be used to acquire a more detailed sensor image of the object and authenticate the user.
In some embodiments, the second authentication level may be used to detect liveness using sub-surface imaging (e.g., from ultrasonic or IR waves). While sensors (e.g., ultrasonic fingerprint sensors) can be effective in verifying and validating a user, spoofing such a system with artificially created fingers or fingerprint patterns remains a concern. An ultrasonic probe may be positioned on the surface of an ultrasonic fingerprint-enabled display to detect firings of an ultrasonic transmitter associated with an ultrasonic sensor array as a finger is placed on the display surface, while a capacitance probe may be used to determine stimulus of electrodes that may be used for liveness detection. The “liveness” of the finger may be detected using the touch-sensitive screen by, for example, recognizing capacitance variations with the perfusion of blood on the tip of the finger.
As discussed, a PCT touchscreen positioned above a display may be coupled to a sensor array (e.g., an ultrasonic sensor array) that is positioned below a portion of the display. In some embodiments, select electrodes (e.g., capacitive electrodes) on, incorporated into, or otherwise included with the PCT touchscreen or the ultrasonic sensor array may be stimulated to detect changes with respect to time in the permittivity (εr(t)) of a finger positioned on the display above the fingerprint sensor; these changes can be used to detect the heart rate or liveness of the finger. The “εr” may be referred to as relative permittivity and is normalized to the free-space permittivity, which may be referred to as “ε0.” The electrodes may be used to detect heart rate or liveness by injecting a signal into one or more of the selected electrodes. As blood flows through the body, the amount of blood with respect to other biological tissues varies, going up and down and changing the electrical characteristics (e.g., electrical impedance) of the body portion in contact with the touchscreen.
To detect the slight changes in capacitance that occur with the pulsing of blood into the finger, one or more signals may be injected into select electrodes that are part of the touch-sensitive screen (e.g., column or row electrodes) or the sensor array. For example, small electrical signals in the tens-of-kilohertz to tens-of-megahertz range may be injected into one of the electrodes, and the corresponding signal detected at another of the electrodes. The capacitive coupling between the first and second electrodes may be determined in part by the permittivity of the object in proximity to or touching the electrodes. The effective permittivity may vary with the proportion of blood in the finger region and may be correlated with blood-pressure pulses. The capacitance is typically proportional to the effective permittivity, which may correlate to the amount of blood in the finger at a particular point in time. As the perfusion of larger and smaller amounts of blood with the user's beating heart changes the effective permittivity of the finger, capturing and filtering time-domain signals from the selected electrodes allows determination of liveness by detecting the beating heart. The liveness detection method may be applied before, after, or while acquiring fingerprint image data (e.g., ultrasonic fingerprint image data), adding an important component of liveness to user validation and reducing the ability to spoof the authentication system.
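As a first-order illustration (a parallel-plate approximation, not a formula from the disclosure), the sensed capacitance tracks the finger's effective permittivity:

C(t) ≈ ε0 · εr(t) · A / d

where A is the effective electrode coupling area and d is the effective dielectric gap. A heartbeat modulates εr(t), so C(t) carries an approximately 1-2 Hz component that the filtering described below can isolate.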
A capacitance against time signal 1304 represents the ventricular pressure of a user against time (in arbitrary units, with an approximately 1-2 Hz heart rate). Although ventricular pressure is illustrated, it should be understood that this is illustrative and not intended to be limiting; other types of biological pressure may be detected and used to determine liveness. For example, atrial and/or aortic pressure may be used. Additionally, the user's pulse may be detected.
The user may interact with capacitive pulse-detection electrodes associated with the sensor array (e.g., on the touchscreen, sensor array, or periphery of the touchscreen or sensor array). Referring back to
The capacitance against time signal 1304 may be filtered to extract liveness information. The capacitance against time signal 1304 may be determined using one or more liveness signal injection frequencies (e.g., in a range between about 10 kHz and 100 MHz) to detect changes in permittivity or capacitance with pulse as a function of excitation frequency (i.e., effective impedance as a function of frequency). To extract liveness information, the capacitance against time signal 1304 may be pre-filtered and Fast Fourier Transform (FFT) analysis may be performed on the signal. For example, the liveness detection signal may be filtered with a low-pass filter to remove high-frequency noise from normal operation of the touchscreen or to remove the injected liveness detection signal. The FFT may reveal content of the liveness signal in the approximately 1-2 Hz range, indicative of a human heartbeat. Lack of a signal above a liveness detection threshold in the approximately 1-2 Hz range may indicate that the object being detected and/or authenticated is not live. To ensure proper liveness detection, it may be desirable for the user's finger to remain at the same location on the touch-sensitive screen for about 1-2 pulses of the heart.
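A minimal sketch of the pre-filter-then-FFT decision described above is shown below; the filter order, cutoff, and thresholding scheme are illustrative assumptions:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def detect_liveness(capacitance: np.ndarray, fs: float, threshold: float) -> bool:
    """Decide liveness from a capacitance-against-time trace sampled at fs Hz.

    Low-pass filter to remove touchscreen/injection-frequency noise, then
    look for spectral content in the human heartbeat band (~1-2 Hz).
    """
    b, a = butter(4, 5.0, btype="low", fs=fs)        # keep only sub-5 Hz content
    filtered = filtfilt(b, a, capacitance - capacitance.mean())
    spectrum = np.abs(np.fft.rfft(filtered))
    freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fs)
    band = (freqs >= 1.0) & (freqs <= 2.0)           # human heartbeat band
    return bool(spectrum[band].max() > threshold)    # peak above threshold => live
```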
In some embodiments, particular electrodes of a PCT touchscreen may be used to detect heart-rate pulses, and an underlying sensor system 114 (e.g., an ultrasonic biometric sensor) may be used to acquire fingerprint image information. The fingerprint image information may be used to identify or otherwise verify the user, and the heart-rate information may be used to ascertain liveness of the user and diminish spoofing. In an example, sensor system 114 is an ultrasonic fingerprint system, and the user may place a finger 204 on a touchscreen above the ultrasonic fingerprint sensor. In response to the touchscreen detecting the placement of the finger, the ultrasonic sensor may be fired to acquire an ultrasonic fingerprint image, and particular electrodes of the PCT touchscreen or the sensor array may be excited and sampled to acquire pulse information associated with the finger.
At an action 1420, the fingerprint control block may receive an output of actions 212, 1410, and/or 214, and object detection system 110 may process the inputs. In an example, sensor array 116 may fire one or more transmitters, and processing component 132 may acquire fingerprint image data when finger 204 is located above sensor array 116. The processing component 132 may monitor the presence of the finger while pulse information is acquired, to ensure that no substantive movement of the finger has occurred and that the finger remains in contact with the surface of the touchscreen. At an action 1424, acquisition timing may be determined and used as an input into sensor system 114. Additionally, at an action 1426, visual, audio, and haptic feedback may be provided to the user.
At an action 1430, the touch-sensitive screen may perform actions such as determining when finger 204 is located above sensor array 116, enabling select PCT electrodes to detect pulse information associated with finger 204, acquiring pulse information associated with finger 204, and providing pulse information to processing component 132 for combining with the fingerprint image data and generating a liveness output signal (e.g., capacitance against time signal 1304 in
In some embodiments, the electrodes used to detect the user's pulse may be arranged in a column and row structure. In some implementations, one, some or all of the rows and columns of a PCT touchscreen may serve additionally as electrodes for detecting the user's pulse. In some implementations, dedicated pulse-detection electrodes may be included with the touch-sensitive screen or with a fingerprint sensor array 116 of the sensor system 114. As discussed, signals may be injected into a particular row and column of the structure to stimulate one or more electrodes for liveness detection. If the same electrodes are being used to scan the user's finger for acquisition of an image of the finger, it may be desirable not to interfere with the scanning operation of the particular row(s) and column(s) of the structure. In one example, to overcome this interference, operation of the scanning may be suspended during liveness detection and resumed after liveness detection. In another example, the injected liveness detection signal may be capacitively coupled to select row(s) and column(s) of the structure at a frequency or frequencies removed from normal PCT touchscreen operation. In another example, selective filtering may be performed to separate the pulse detected in a particular set of interacting or overlapping electrodes from the signals used to determine whether an object is above or touching the touch-sensitive screen.
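The suspend-and-resume approach could be time-multiplexed as in the sketch below; the `touch` controller methods and the scheduling cadence are hypothetical stand-ins:

```python
def scan_with_liveness(touch, n_frames: int = 60, liveness_every: int = 10):
    """Time-multiplex touch scanning with liveness injection on shared
    row/column electrodes: suspend the scan, inject and sample the liveness
    signal, then resume ordinary acquisition."""
    samples = []
    for frame in range(n_frames):
        if frame % liveness_every == 0:
            touch.suspend_scan()                     # avoid driving shared electrodes
            samples.append(touch.sample_liveness())  # inject + read one sample
            touch.resume_scan()
        touch.scan_frame()                           # ordinary touch acquisition
    return samples                                   # capacitance-against-time points
```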
In some embodiments, capacitive electrodes at the perimeter of sensor array 116 may be used to detect heart rate or liveness while the sensor array acquires a fingerprint image. The fingerprint image information may be used to identify or verify the user, and the heart-rate information may be used to determine liveness of the user to diminish spoofing. Capacitive electrodes included with the fingerprint sensor (as opposed to electrodes associated with an overlying touch-sensitive screen) may allow permittivity or impedance variations with heart rate to be detected while fingerprint information is acquired.
At an action 1505, fingerprint data and capacitive data may be captured and processed. At an action 1520, the fingerprint control block may receive an output of actions 1505 and/or 214, and object detection system 110 may process the inputs. In an example, capacitance electrodes 1550 and 1552 may detect the placement of the finger capacitively, sensor array 116 (not shown in
Capacitance electrodes 1550 and 1552 may detect the pulse associated with finger 204 capacitively, and processing component 132 may acquire the pulse information. Processing component 132 may process the fingerprint image data and pulse information to identify or verify the user and to generate a liveness output signal. Additionally, processing component 132 may monitor the finger presence ultrasonically while the pulse information is being acquired via capacitance electrodes 1550 and 1552. Alternatively, processing component 132 may monitor the finger presence via capacitance electrodes 1550 and 1552 while fingerprint image data is being acquired ultrasonically.
In some embodiments, segmented bias electrodes on the upper layers of the sensor array may be used to detect heart rate or liveness while processing component 132 acquires a fingerprint image. The fingerprint image information may be used to identify or verify a user, and the heart-rate information may be used to determine liveness. For example, segmented bias electrodes on the upper layers of an ultrasonic sensor array 116 may be used to detect heart rate and/or liveness while the sensor array 116 acquires a fingerprint image ultrasonically.
A computing device may run platform 1800, which may include a user interface 1802 that is in communication with a control unit 1804; e.g., control unit 1804 may accept data from and control user interface 1802. User interface 1802 may include display 304, which includes a means for displaying graphics, text, and images, such as an LCD or OLED display.
User interface 1802 may further include a keypad 1810 or other input device through which the user can input information into platform 1800. If desired, keypad 1810 may be obviated by integrating a virtual keypad into display 304. It should be understood that with some configurations platform 1800 or portions of user interface 1802 may be physically separated from control unit 1804 and connected to control unit 1804 via cables or wirelessly, for example, in a Bluetooth headset. Touch sensor 1812 may be used as part of user interface 1802 by detecting an object that is touching a surface of the touch-sensitive screen. Touch sensor 1812 may be, for example, a capacitive touch sensor such as a PCT touchscreen or dedicated capacitive electrodes on a portion of the touchscreen or a sensor array 116.
Object detection system 110 may detect an object and capture one or more ultrasonic images of the object. Control unit 1804 may accept and process data from user interface 1802, touch sensor 1812, and sensor array 116. Platform 1800 may include means for detecting signals (e.g., ultrasonic, optical or IR signals) reflected from an object with respect to a touch-sensitive screen of a device. Platform 1800 may further include means for capturing one or more sensor images of the object based on the reflected signals. When an object is located above the means for capturing the one or more images, the object may be located above at least a portion of the touch-sensitive screen.
Control unit 1804 may include one or more processors 1820 and associated memory 1822, hardware 1824, software 1826, and firmware 1828. Control unit 1804 may include means for controlling object detection system 110. Components of object detection system 110 may be included in processor 1820, memory 1822, hardware 1824, firmware 1828, or software 1826 (e.g., computer-readable media stored in memory 1822, such as methods 600, 1000, 1100, and 1200, and executed by processor 1820), or a combination thereof. Processor 1820 may correspond to processing component 132 and execute instructions to capture one or more sensor images of objects. In an example, processing component 132 may capture ultrasonic images of objects.
It will also be understood, as used herein, that processor 1820 can, but need not necessarily, include one or more microprocessors, embedded processors, controllers, application specific integrated circuits (ASICs), digital signal processors (DSPs), graphics processing units (GPUs), and the like. The term processor is intended to describe the functions implemented by the system rather than specific hardware. Moreover, as used herein the term “memory” refers to any type of computer storage medium, including long-term, short-term, or other memory associated with the platform, and is not to be limited to any particular type of memory, number of memories, or type of media upon which memory is stored.
The methodologies described herein may be implemented by various means depending upon the application. For example, these methodologies may be implemented in hardware 1824, software 1826, firmware 1828, or any combination thereof. For a hardware implementation, the processing units may be implemented within one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, electronic devices, other electronic units designed to perform the functions described herein, or a combination thereof.
For a firmware and/or software implementation, the methodologies may be implemented with modules (e.g., procedures, functions, and so on) that perform the functions described herein. Any machine-readable medium tangibly embodying instructions may be used in implementing the methodologies described herein. For example, software codes may be stored in memory 1822 and executed by processor 1820. Memory may be implemented within the processor unit or external to the processor unit.
For example, software 1826 may include program code stored in memory 1822 and executed by processor 1820 and may be used to run the processor and to control the operation of platform 1800 as described herein. Program code stored in a computer-readable medium, such as memory 1822, may include program code to detect, by a sensor array coupled to a touch-sensitive screen of a device, signals reflected from an object with respect to the touch-sensitive screen and to capture, based on the reflected signals, one or more images of the object, where at least a portion of the ultrasonic sensor array overlaps with at least a portion of the touch-sensitive screen. The program code stored in a computer-readable medium may additionally include program code to cause the processor to control any operation of platform 1800 as described further below.
If implemented in firmware and/or software, the functions may be stored as one or more instructions or code on a computer-readable medium. Examples include computer-readable media encoded with a data structure and computer-readable media encoded with a computer program. Computer-readable media includes physical computer storage media. A storage medium may be any available medium that can be accessed by a computer. By way of example and not limitation, such computer-readable media can include RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
One skilled in the art may readily devise other systems consistent with the disclosed embodiments which are intended to be within the scope of this disclosure. The foregoing disclosure is not intended to limit the present disclosure to the precise forms or particular fields of use disclosed. As such, it is contemplated that various alternate embodiments and/or modifications to the present disclosure, whether explicitly described or implied herein, are possible in light of the disclosure. Changes may be made in form and detail without departing from the scope of the present disclosure. Thus, the present disclosure is limited only by the claims.