This disclosure relates generally to authentication devices and methods, particularly authentication devices and methods applicable to mobile devices.
As mobile devices become more versatile, user authentication becomes increasingly important. Increasing amounts of personal information may be stored on and/or accessible by a mobile device. Moreover, mobile devices are increasingly being used to make purchases and perform other commercial transactions. Existing authentication methods typically involve the use of a password or passcode, which may be forgotten by a rightful user or used by an unauthorized person. Improved authentication methods would be desirable.
The systems, methods and devices of the disclosure each have several innovative aspects, no single one of which is solely responsible for the desirable attributes disclosed herein.
One innovative aspect of the subject matter described in this disclosure can be implemented in a method that involves presenting an image on a display device indicating an area for a user to touch and obtaining partial fingerprint data from at least a partial finger touch in the area. The finger touch may, for example, involve left-thumb-side touching, right-thumb-side touching, or fingertip touching. The partial fingerprint data may correspond to a touching portion of a finger or a thumb. The method may involve comparing the partial fingerprint data with master fingerprint data of a rightful user and determining, based at least in part on the comparing process, whether to invoke a function. Invoking the function may involve authorizing a transaction, starting a personalized application or unlocking the display device. In some examples, the determination of whether to invoke the function may involve determining whether to authorize a transaction based on a level of security.
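By way of a hypothetical illustration only (the function names, the set-of-features representation of fingerprint data, and the 0.8 threshold below are assumptions for exposition, not part of this disclosure), the comparing and determining processes described above might be sketched as:

```python
# Hypothetical sketch of the partial-fingerprint authentication flow.
# Feature labels, names, and the 0.8 match threshold are illustrative assumptions.

def match_score(partial_data: set, master_data: set) -> float:
    """Fraction of the partial print's features found in the master data."""
    if not partial_data:
        return 0.0
    return len(partial_data & master_data) / len(partial_data)

def determine_whether_to_invoke(partial_data: set, master_data: set,
                                threshold: float = 0.8) -> bool:
    """Compare partial fingerprint data with the rightful user's master data."""
    return match_score(partial_data, master_data) >= threshold

# Example: features extracted from a left-thumb-side touch
master = {"ridge_ending_12_40", "bifurcation_7_22", "bifurcation_30_5", "core_18_18"}
partial = {"ridge_ending_12_40", "bifurcation_7_22", "bifurcation_30_5"}
assert determine_whether_to_invoke(partial, master)  # function may be invoked
```

The function invoked on a successful comparison could be any of those named above: authorizing a transaction, starting a personalized application, or unlocking the display device.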
In some implementations, the partial fingerprint data may include known fingerprint data of the current master fingerprint data and new fingerprint data. The method may involve updating the master fingerprint data to include the new fingerprint data. The updating process may involve augmenting the master fingerprint data and/or adapting the master fingerprint data.
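One way the updating process might work, sketched here as a hypothetical illustration (the 0.6 "known fraction" gate and all names are assumptions, not part of this disclosure), is to augment the master fingerprint data with new features only when enough of the partial print already matches:

```python
# Illustrative sketch of updating master fingerprint data when a partial
# print contains both known and new features; all values are assumptions.

def update_master(master: set, partial: set,
                  min_known_fraction: float = 0.6) -> set:
    """Augment the master data with new features, but only when enough of
    the partial print already matches the master (i.e., the touch is
    confidently attributed to the rightful user)."""
    known = partial & master
    new = partial - master
    if partial and len(known) / len(partial) >= min_known_fraction:
        return master | new   # augmenting: add the newly observed features
    return master             # not confident enough; leave master unchanged
```

Adapting the master fingerprint data (as opposed to augmenting it) might instead replace or re-weight existing features; the sketch above shows only the augmenting case.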
The method may involve determining finger tap characteristic data of the rightful user. Determining whether to invoke the function may be based, at least in part, on comparing finger tap characteristic data of a current user with finger tap characteristic data of the rightful user. In some implementations, the finger tap characteristic may correspond with a number of taps, a frequency of taps, a sequence of taps and/or an auditory signature.
The process of obtaining partial fingerprint data may involve an ultrasonic imaging process. In some such implementations, the process of obtaining partial fingerprint data may involve obtaining the partial fingerprint data via an ultrasonic sensor array while maintaining an ultrasonic transmitter in an “off” state.
In some implementations, the method may involve receiving device movement data. The determining process may be based, at least in part, on the device movement data.
The indicated area for the user to touch may differ according to the implementation. In some examples, the area for the user to touch may be within a display area, outside the display area or on a back of the display device. In some implementations, the area for the user to touch may overlap at least a portion of a fingerprint acquisition system.
In some implementations, the method may involve prompting the user to provide substantially complete fingerprint data for at least one finger. The method may involve associating the substantially complete fingerprint data with the rightful user and storing the substantially complete fingerprint data in a memory.
In some examples, the method may involve presenting one or more purchasing icons on the display device. The purchasing icons may, for example, correspond to purchasable items. The method may involve moving a representation of one of the purchasing icons onto the indicated area in response to a corresponding dragging movement of the touching portion of the finger or thumb. The method may involve determining whether to authorize a transaction.
In some implementations, the method may involve presenting one or more application icons on the display device. Each of the application icons may correspond to a software application. The method may involve moving a representation of one of the application icons onto the indicated area in response to a corresponding dragging movement of the touching portion of the finger or thumb. The method may involve determining whether to start the corresponding application.
Other innovative aspects of the subject matter described in this disclosure can be implemented in a method that involves presenting an image on a display device indicating an area for a user to touch in order to make a commercial transaction. The method may involve determining a level of security corresponding to the commercial transaction. The method also may involve obtaining partial fingerprint data from at least a partial finger touch in the area. The partial fingerprint data may correspond to a touching portion of a finger or a thumb. The method also may involve comparing the partial fingerprint data with master fingerprint data of a rightful user and determining, based at least in part on the comparing process and the level of security, whether to authorize the commercial transaction.
The level of security may be based on one or more of a requested payment amount, an amount of available credit, an amount of money to be transferred between accounts, a type of merchandise or the user's credit score. In some examples, the method may involve determining that the level of security indicates that additional data will be required in order to determine whether to authorize the commercial transaction. The additional data may include full fingerprint data for at least one finger, a finger tap characteristic and/or device movement data. The finger tap characteristic may correspond with a number of taps, a frequency of taps, a sequence of taps and/or an auditory signature.
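As a hypothetical illustration of how a level of security might gate a commercial transaction (the tier numbers, dollar thresholds, and names below are assumptions for exposition, not part of this disclosure):

```python
# Illustrative mapping from transaction attributes to a level of security;
# the tiers and dollar thresholds are assumptions, not part of the disclosure.

def level_of_security(payment_amount: float,
                      high_risk_merchandise: bool = False) -> int:
    """Return 1 (low), 2 (medium), or 3 (high) based on the transaction."""
    if payment_amount >= 500 or high_risk_merchandise:
        return 3
    if payment_amount >= 50:
        return 2
    return 1

def additional_data_required(level: int) -> list:
    """Data needed beyond partial fingerprint data for the given level."""
    if level == 3:
        return ["full fingerprint data", "finger tap characteristic"]
    if level == 2:
        return ["finger tap characteristic"]
    return []
```

In such a scheme, a low-value purchase could proceed on partial fingerprint data alone, while a high-value transfer would prompt for the additional data enumerated above.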
Some or all of the methods described herein may be performed by one or more devices according to instructions (e.g., software) stored on non-transitory media. Such non-transitory media may include memory devices such as those described herein, including but not limited to random access memory (RAM) devices, read-only memory (ROM) devices, etc. Accordingly, other innovative aspects of the subject matter described in this disclosure can be implemented in a non-transitory medium having software stored thereon. For example, the software may include instructions for controlling at least one apparatus to present an image indicating an area for a user to touch and obtain partial fingerprint data from at least a partial finger touch in the area. The partial fingerprint data may correspond to a touching portion of a finger or a thumb. The software may include instructions for controlling at least one apparatus to compare the partial fingerprint data with master fingerprint data of a rightful user and to determine, based at least in part on the comparing process, whether to invoke a function.
The function may involve authorizing a transaction, starting a personalized application, or unlocking the display device. The partial fingerprint data may include known fingerprint data of the current master fingerprint data and new fingerprint data. The software may include instructions for controlling at least one apparatus to update the master fingerprint data to include the new fingerprint data. The updating may involve at least one of augmenting the master fingerprint data or adapting the master fingerprint data. The obtaining may involve an ultrasonic imaging process.
The software may include instructions for controlling at least one apparatus to present one or more purchasing icons on the display device. The purchasing icons may correspond to purchasable items. The software may include instructions for controlling at least one apparatus to move a representation of one of the purchasing icons onto the indicated area in response to a corresponding dragging movement of the touching portion of the finger or thumb and to determine whether to authorize a transaction.
In some examples, the software may include instructions for controlling at least one apparatus to present one or more application icons on the display device. Each of the application icons may correspond to a software application. The software may include instructions for controlling at least one apparatus to move a representation of one of the application icons onto the indicated area in response to a corresponding dragging movement of the touching portion of the finger or thumb and to determine whether to start the corresponding application.
Other innovative aspects of the subject matter described in this disclosure can be implemented in an apparatus that may include a display, a fingerprint acquisition system and a control system. The control system may be capable of controlling the display to present an image indicating an area for a user to touch; controlling the fingerprint acquisition system to obtain partial fingerprint data from at least a partial finger touch in the area, the partial fingerprint data corresponding to a touching portion of a finger or a thumb; comparing the partial fingerprint data with master fingerprint data of a rightful user; and determining, based at least in part on the comparing process, whether to invoke a function.
The apparatus may include a motion sensor system capable of sensing device movement and providing device movement data to the control system. The control system may be capable of determining whether the device movement data corresponds with device movement data of the rightful user.
In some implementations, the apparatus may include a finger tap sensing system. The control system may be capable of receiving, from the finger tap sensing system, information regarding one or more finger taps and of determining finger tap characteristic data based on the information regarding one or more finger taps. Determining whether to invoke the function may be based, at least in part, on comparing the finger tap characteristic data with finger tap characteristic data of the rightful user. The finger tap characteristic data may correspond with a number of taps, a frequency of taps, a sequence of taps and/or an auditory signature.
In some examples, the fingerprint acquisition system may include an ultrasonic imaging system. According to some such implementations, the ultrasonic imaging system may include an ultrasonic sensor array and an ultrasonic transmitter. In some examples, the obtaining process may involve obtaining the partial fingerprint data via the ultrasonic sensor array while maintaining the ultrasonic transmitter in an “off” state. In some implementations, the fingerprint acquisition system may be positioned within a display area. However, in alternative implementations the fingerprint acquisition system may be positioned, at least in part, outside the display area. For example, the fingerprint acquisition system may be positioned on the periphery of the display area, on a side of the apparatus, on the back of the apparatus, etc.
Other innovative aspects of the subject matter described in this disclosure can be implemented in a method that may involve presenting an image on a display device indicating an area for a user to touch. The image may correspond to an icon associated with a first software application. The method may involve obtaining partial fingerprint data from at least a partial finger touch in the area. The partial fingerprint data may correspond to a touching portion of a finger or a thumb.
The method may involve comparing the partial fingerprint data with master fingerprint data of a rightful user. The master fingerprint data may, for example, correspond to a second software application relating to authentication functionality. The method may involve determining, based at least in part on the comparing process, whether to update the master fingerprint data to include new fingerprint data. In some examples, the first software application does not relate to authentication functionality. In some implementations, the updating may involve augmenting the master fingerprint data and/or adapting the master fingerprint data.
The method may involve obtaining new finger tap characteristic data of the rightful user. The determining process may involve determining whether to update existing finger tap characteristic data of the rightful user according to the new finger tap characteristic data. In some examples, the finger tap characteristic may correspond with a number of taps, a frequency of taps, a sequence of taps and/or an auditory signature.
In some implementations, the method may involve receiving new device movement data of the rightful user. The determining process may involve determining whether to update existing device movement data of the rightful user according to the new device movement data.
Still other innovative aspects of the subject matter described in this disclosure can be implemented in a method that may involve presenting one or more icons on a display device to a user and receiving an indication that a digit of the user is touching an area of the display device corresponding to one of the presented icons. The method may involve moving a representation of one of the presented icons onto an area indicating a selection of the icon, in response to a corresponding dragging movement of the digit; acquiring biometric information from the digit when the digit is positioned in a fingerprint sensing area; and invoking a function based on the acquired biometric information.
The acquiring process may involve obtaining partial fingerprint data from the digit. Invoking the function may involve authorizing a transaction, starting an application or unlocking the display device.
Yet other innovative aspects of the subject matter described in this disclosure can be implemented in a method that may involve presenting an image on a display device indicating an area for a user to touch and obtaining partial fingerprint data from at least a partial finger touch in the area. The partial fingerprint data may correspond to a touching portion of a finger or a thumb. The method may involve performing an authentication process based, at least in part, on the partial fingerprint data.
In some implementations, the method may involve determining, based at least in part on the authentication process, whether to invoke a function. For example, invoking the function may involve authorizing a transaction, starting a personalized application, or unlocking the display device.
Further innovative aspects of the subject matter described in this disclosure can be implemented in a method that may involve presenting one or more icons on a display and receiving an indication that a user is interacting with at least one of the icons presented. The method may involve acquiring biometric information from a digit, during the user interaction with the icon, when the digit is positioned in a fingerprint sensing area. The method may involve invoking a function based, at least in part, on the acquired biometric information.
In some examples, the acquiring process may involve obtaining partial fingerprint data from the digit. Receiving the indication that the user is interacting with an icon may involve receiving an indication that the digit is touching an area of the display device that corresponds to one of the presented icons. Alternatively, or additionally, receiving the indication may involve receiving an indication of a dragging motion of the digit towards an indicated area. The indicated area may, for example, be displayed on the display. However, in some examples the indicated area may be an edge of the display, a side of a display device or a back of the display device. For example, the display may be on a front side of the display device and the fingerprint sensing area may be on a side of the display device, on the back of the display device, etc.
In some implementations, receiving the indication that the user is interacting with an icon presented may involve receiving an indication that the user has tapped on the icon a number of times and/or within a range of time intervals. In some examples, acquiring the biometric information may involve an ultrasonic imaging process.
Other innovative aspects of the subject matter described in this disclosure can be implemented in an apparatus that may include a display, a fingerprint acquisition system and a control system. The control system may be capable of controlling the display to present an image indicating an area for a user to touch in order to make a commercial transaction, of determining a level of security corresponding to the commercial transaction and of obtaining, via the fingerprint acquisition system, partial fingerprint data from at least a partial finger touch in the area. The partial fingerprint data may correspond to a touching portion of a finger or a thumb. The control system may be capable of comparing the partial fingerprint data with master fingerprint data of a rightful user and of determining, based at least in part on the comparing process and the level of security, whether to authorize the commercial transaction.
In some examples, the level of security may be based on one or more of a requested payment amount, an amount of available credit, an amount of money to be transferred between accounts, a type of merchandise and/or the user's credit score. According to some implementations, the control system may be capable of determining that the level of security indicates that additional data will be required in order to determine whether to authorize the commercial transaction.
Still other innovative aspects of the subject matter described in this disclosure can be implemented in an apparatus that may include a display, a fingerprint acquisition system and a control system. The control system may be capable of controlling the display to present an image indicating an area for a user to touch and of obtaining, via the fingerprint acquisition system, partial fingerprint data from at least a partial finger touch in the area. The partial fingerprint data may correspond to a touching portion of a finger or a thumb. The control system may be capable of comparing the partial fingerprint data with master fingerprint data of a rightful user and of determining, based at least in part on the comparing process, whether to authorize a transaction, start a personalized application, or unlock the apparatus.
In some implementations, the apparatus may include a touch sensing system. The control system may be capable of controlling the display to present one or more purchasing icons on the display. The purchasing icons may correspond to purchasable items. The control system may be capable of receiving, via the touch sensing system, an indication of a dragging movement of the touching portion of the finger or thumb, of controlling the display to move a representation of one of the purchasing icons onto the indicated area, in response to the dragging movement of the touching portion of the finger or thumb, and of determining whether to authorize a transaction.
In some implementations, the control system may be capable of controlling the display to present one or more application icons on the display device. Each of the application icons may correspond to a software application. The control system may be capable of receiving, via the touch sensing system, an indication of a dragging movement of the touching portion of the finger or thumb, of moving a representation of one of the application icons onto the indicated area in response to the dragging movement of the touching portion of the finger or thumb and of determining whether to start an application that corresponds with the representation of one of the application icons.
Still other innovative aspects of the subject matter described in this disclosure can be implemented in an apparatus that may include a display; a touch sensing system; a biometric sensor and a control system. The control system may be capable of controlling the display to present one or more icons and of receiving, via the touch sensing system, an indication that a digit of the user is touching an area of the display device corresponding to one of the presented icons. The control system may be capable of receiving, via the touch sensing system, an indication of a dragging movement of the digit and of controlling the display to move a representation of one of the presented icons onto an area indicating a selection of the icon, in response to the dragging movement of the digit.
The control system may be capable of acquiring biometric information from the digit when the digit is positioned in an area corresponding to the biometric sensor and of invoking a function based on the acquired biometric information. For example, acquiring biometric information may involve obtaining partial fingerprint data from the digit. Invoking the function may involve authorizing a transaction, starting an application or unlocking the apparatus.
Details of one or more implementations of the subject matter described in this specification are set forth in the accompanying drawings and the description below. Other features, aspects, and advantages will become apparent from the description, the drawings, and the claims. Note that the relative dimensions of the following figures may not be drawn to scale. Like reference numbers and designations in the various drawings indicate like elements.
The following description is directed to certain implementations for the purposes of describing the innovative aspects of this disclosure. However, a person having ordinary skill in the art will readily recognize that the teachings herein may be applied in a multitude of different ways. The described implementations may be implemented in any device, apparatus, or system that includes a touch sensing system. In addition, it is contemplated that the described implementations may be included in or associated with a variety of electronic devices such as, but not limited to: mobile telephones, multimedia Internet enabled cellular telephones, mobile television receivers, wireless devices, smartphones, Bluetooth® devices, personal data assistants (PDAs), wireless electronic mail receivers, hand-held or portable computers, netbooks, notebooks, smartbooks, tablets, printers, copiers, scanners, facsimile devices, global positioning system (GPS) receivers/navigators, cameras, digital media players (such as MP3 players), camcorders, game consoles, wrist watches, clocks, calculators, television monitors, flat panel displays, electronic reading devices (e.g., e-readers), mobile health devices, computer monitors, auto displays (including odometer and speedometer displays, etc.), cockpit controls and/or displays, camera view displays (such as the display of a rear view camera in a vehicle), electronic photographs, electronic billboards or signs, projectors, architectural structures, microwaves, refrigerators, stereo systems, cassette recorders or players, DVD players, CD players, VCRs, radios, portable memory chips, washers, dryers, washer/dryers, parking meters, packaging (such as in electromechanical systems (EMS) applications including microelectromechanical systems (MEMS) applications, as well as non-EMS applications), aesthetic structures (such as display of images on a piece of jewelry or clothing) and a variety of EMS devices. 
The teachings herein also may be used in applications such as, but not limited to, electronic switching devices, radio frequency filters, sensors, accelerometers, gyroscopes, motion-sensing devices, magnetometers, inertial components for consumer electronics, parts of consumer electronics products, varactors, liquid crystal devices, electrophoretic devices, drive schemes, manufacturing processes and electronic test equipment. Thus, the teachings are not intended to be limited to the implementations depicted solely in the Figures, but instead have wide applicability as will be readily apparent to one having ordinary skill in the art.
Some implementations described herein use touch biometrics to authenticate a user of a device, such as a mobile display device. In some implementations, an authentication method may involve presenting an image on a display device indicating an area for a user to touch, e.g., to tap. The image may, for example, be an icon associated with an application or “app” that is presented on a display device. The method may involve obtaining at least partial fingerprint data from one or more finger taps or touches in the area. The partial fingerprint data may correspond to a touching portion of a finger or thumb. As used herein, the term “fingerprint” may refer to a fingerprint or a thumbprint.
The method may involve comparing the partial fingerprint data with master fingerprint data of the rightful user and determining, based at least in part on the comparing process, whether to invoke a function. For example, the master fingerprint data may correspond with a relatively more complete fingerprint image that is stored in a memory of, or accessible by, the display device. The function may, for example, involve authorizing a commercial transaction, starting an app, or unlocking the display device. In some implementations, the function may involve authorizing a transaction based on a level of security.
Some such methods may involve obtaining and using touch biometrics, such as fingerprint data and/or finger tap characteristics, in a manner that is transparent to the user. Fingerprint data, finger tap characteristics and/or other biometric data may be obtained and used to enroll and/or authenticate the user while the user is interacting with an application in a normal fashion, e.g., in the native environment of the application. For example, the method may involve presenting an image (such as an icon) on the display device and prompting a user to touch or tap the image in order to make an electronic payment. The payment may be authenticated using biometric information obtained during the touch without the need for the user to be aware of the process.
Block 110 involves obtaining partial fingerprint data from at least a partial finger touch in the area. Here, the partial fingerprint data corresponds to a touching portion of a finger or a thumb. As used herein, "fingerprint data" may include various types of data known by those of skill in the various fields of fingerprint identification or "dactyloscopy," including but not limited to finger or thumb friction ridge image data and data used to characterize fingerprint minutiae, such as data corresponding to the types, locations and/or spacing of fingerprint minutiae. Examples of partial fingerprint data are described below.
In this example, block 115 involves comparing the partial fingerprint data with master fingerprint data of a rightful user. The master fingerprint data may have been obtained during an enrollment process, during which a rightful user provided "full," or substantially complete, fingerprint data for one or more fingers and/or thumbs. The terms "full fingerprint data" and "substantially complete fingerprint data" may be used interchangeably herein. These terms may, for example, correspond to fingerprint data that may be obtained by placing a finger or thumb in a substantially flat position over an area corresponding to a fingerprint acquisition system, by "rolling" the finger or thumb over such an area, etc. It will be understood that "full" or "substantially complete" fingerprint data does not necessarily mean fingerprint data corresponding to each and every friction ridge or whorl of a finger or thumb. Some such implementations may involve prompting the rightful user to provide full fingerprint data for at least one finger, associating the full fingerprint data with the rightful user and storing the full fingerprint data in a memory. Such full fingerprint data may be stored as at least part of the master fingerprint data. In some implementations, for example, full fingerprint data for one finger may be aggregated with full fingerprint data for at least one other finger, thumb, etc., as the master fingerprint data. Fingerprint data may include data from portions of one or more fingertips near the fingernail, representative of where an individual might physically touch a touchscreen of a mobile device.
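The enrollment and aggregation steps just described might be sketched, purely as a hypothetical illustration (the per-digit dictionary representation and every name below are assumptions, not part of this disclosure), as:

```python
# Illustrative enrollment sketch: substantially complete fingerprint data
# for several digits is associated with the rightful user and aggregated
# into the master fingerprint data. All names are illustrative assumptions.

def enroll_rightful_user(user_id: str, full_prints: dict, store: dict) -> None:
    """Associate full fingerprint data with the rightful user and store it.

    full_prints maps a digit name (e.g., "right_thumb") to the set of
    features extracted from that digit's substantially complete print.
    """
    master = set()
    for features in full_prints.values():
        master |= set(features)          # aggregate across fingers/thumbs
    store[user_id] = {"per_digit": full_prints, "master": master}

# Example usage: the store stands in for a memory of, or accessible by,
# the display device.
memory = {}
enroll_rightful_user("rightful_user", {
    "right_thumb": {"core_18_18", "bifurcation_7_22"},
    "right_index": {"ridge_ending_12_40"},
}, memory)
```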
However, as described below, some implementations involve obtaining, augmenting, adapting and/or updating master fingerprint data while a user is performing other operations with a display device, such as tapping a touch panel while interacting with other software applications on a display device (such as browsing the Internet, using a cellular telephone, making commercial transactions, etc.).
The master fingerprint data may be stored locally, e.g., in a memory of a display device. Alternatively, or additionally, the master fingerprint data may be stored in another device, such as a memory device accessible via a data network. For example, the master fingerprint data may be stored on a memory device of, or a memory device accessible by, a server.
In this example, block 120 involves determining, based at least in part on the comparing process of block 115, whether to invoke a function. Invoking the function may, for example, involve authorizing a transaction such as a commercial transaction. In some implementations, invoking the function may involve starting a personalized application or unlocking the display device. In some implementations, a personalized application may be a personal email account, a personal calendar, or an application displaying a dashboard of a user's physical activity, e.g., number of steps and calories burned that may be measured by an activity sensor worn on the body of the user. In some implementations, the personalized application may be a virtual private network (VPN) and invoking the function may involve establishing the VPN. According to some such implementations, a VPN may be established based only upon the partial fingerprint data, whereas in alternative implementations further information, such as a user ID and/or pass code, may need to be provided and evaluated before the VPN can be established.
In some implementations, block 120 may involve invoking computer software for fingerprint identification, which also may be referred to as fingerprint individualization. Such software may be stored on a non-transitory medium, such as a portion of a memory system of a display device. Alternatively, or additionally, at least some of the related software may be stored in a memory system of another device that the display device may be capable of accessing, e.g., via a data network. Such fingerprint identification software may, for example, include instructions for controlling one or more devices to apply threshold scoring rules to determine whether the master fingerprint data and the partial fingerprint data correspond to the same finger(s) or thumb(s). The scoring rules may, for example, pertain to comparing the types, locations and/or spacing of fingerprint minutiae indicated by the master fingerprint data and the partial fingerprint data.
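One way such threshold scoring rules might operate, sketched here as a hypothetical illustration (a production matcher would also handle rotation, translation and skin distortion; the tuple representation, tolerance and threshold below are assumptions, not part of this disclosure):

```python
# Illustrative minutia-comparison sketch. Each minutia is a (type, x, y)
# tuple; the distance tolerance and score threshold are assumptions.
import math

def minutiae_match_score(partial, master, tol=3.0):
    """Fraction of partial minutiae with a same-type master minutia nearby."""
    matched = 0
    for p_type, px, py in partial:
        for m_type, mx, my in master:
            if p_type == m_type and math.hypot(px - mx, py - my) <= tol:
                matched += 1
                break
    return matched / len(partial) if partial else 0.0

def same_finger(partial, master, threshold=0.75):
    """Apply a threshold scoring rule to decide whether the partial and
    master data correspond to the same finger or thumb."""
    return minutiae_match_score(partial, master) >= threshold
```

Comparing minutia types and locations in this way corresponds to the scoring rules described above; spacing between minutiae could be compared similarly.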
In some implementations, additional types of authentication data may be evaluated in method 100 and/or other methods described herein. In some such implementations, additional types of authentication data may be evaluated because the determination of whether to invoke the function (block 120 of
For example, in some implementations finger tap characteristic data may be evaluated to determine whether the finger tap characteristic data corresponds with finger tap characteristic data of a rightful user. Finger tap characteristic data may, for example, correspond with a frequency of taps (e.g., as measured by the average time interval between taps) and/or a number of taps (e.g., as measured by the average number of taps during a predetermined time interval), as well as with other tap characteristics such as the pressure or the dwell time of each tap. Accordingly, the frequency of taps and/or number of taps can indicate how quickly the user normally taps on the display device, e.g., when interacting with one or more graphic user interfaces displayed on the display device (e.g., when interacting with a keypad).
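A minimal sketch of deriving such finger tap characteristic data from a stream of tap timestamps; the window length and the particular features chosen here are assumptions for illustration:

```python
from statistics import mean

def tap_features(timestamps, window=10.0):
    """Summarize a tap stream: average inter-tap interval (seconds)
    and average number of taps per `window`-second interval."""
    intervals = [b - a for a, b in zip(timestamps, timestamps[1:])]
    avg_interval = mean(intervals) if intervals else float("inf")
    span = timestamps[-1] - timestamps[0] if len(timestamps) > 1 else window
    taps_per_window = len(timestamps) * window / max(span, window)
    return avg_interval, taps_per_window

# Hypothetical tap timestamps (seconds) captured by a touch sensing system.
taps = [0.0, 0.4, 0.9, 1.3, 1.8, 2.2]
avg, rate = tap_features(taps)
```

A stored pair of such features for the rightful user could then be compared against the features of the person currently tapping, within some tolerance.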
The frequency of taps and/or number of taps may be determined by a finger tap sensing system. In some implementations, the finger tap sensing system may include a microphone of a display device. In some implementations, the finger tap sensing system may include a touch sensing system of the display device, including but not limited to the types of touch sensing systems described herein.
In some implementations, finger tap characteristic data may be based, at least in part, on an audio signature of the rightful user's finger taps. For example, some users may normally have relatively longer fingernails. The sound produced by tapping on a display device with a fingertip that includes a fingernail will differ from the sound produced by tapping on a display device with a fingertip that does not include a fingernail. Relatively thinner fingers will produce different tapping sounds than relatively fleshy, fat fingers. Larger fingers will tend to produce different tapping sounds than relatively smaller fingers. A microphone of a display device may be used to capture audio data corresponding to a rightful user's tapping sounds, e.g., during an enrollment period or during routine use of the display device.
Based, at least in part, on audio data corresponding to the tapping sounds, a control system of the display device (or of another device) may determine an audio signature of the rightful user's finger taps. For example, a control system may be capable of transforming the audio data from the time domain into the frequency domain. The control system may be capable of dividing the frequency domain data into a predetermined number of frequency ranges and of determining the power corresponding to the audio data in each of the frequency ranges. In such implementations, an audio signature of the rightful user's finger taps may be based, at least in part, on the power in each of the frequency ranges. For example, the audio signature of the rightful user's finger taps may be based, at least in part, on the average power in each of the frequency ranges. The resulting audio signature may be used during an authentication process, e.g., by comparing the audio signature of the rightful user's finger taps with an audio signature of a person currently using the display device. In some implementations, a sequence of taps such as tap-tap-pause-tap may be sensed and compared to a stored sequence to determine a rightful user and invoke a function when the sequence is matched.
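One way such an audio signature could be computed is sketched below, using a naive discrete Fourier transform to obtain the average power in a predetermined number of frequency ranges. The band count, sample rate, and test tone are assumptions for illustration; a practical implementation would use an FFT over longer recordings.

```python
import math

def band_powers(samples, n_bands=4):
    """Naive DFT-based audio signature: average power in n_bands equal
    frequency ranges between DC and the Nyquist frequency."""
    n = len(samples)
    half = n // 2
    powers = [0.0] * n_bands
    counts = [0] * n_bands
    for k in range(1, half):  # skip the DC bin
        re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        im = sum(-s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        band = min(n_bands - 1, (k * n_bands) // half)
        powers[band] += (re * re + im * im) / n
        counts[band] += 1
    return [p / c if c else 0.0 for p, c in zip(powers, counts)]

# A 500 Hz test tone sampled at 8 kHz: its energy should land in the
# lowest of the four frequency ranges.
tone = [math.sin(2 * math.pi * 500 * i / 8000) for i in range(64)]
signature = band_powers(tone)
```

The resulting vector of per-band powers is the signature; authenticating a current user would amount to comparing his or her vector against the stored one within a tolerance.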
Accordingly, the determination of whether to invoke a function (in block 120 of
In the implementation shown in
However, in addition to providing functionality for the app, partial fingerprint data may be obtained when the user touches or taps a touching portion of a finger or a thumb in the area of the image 130. Accordingly, this process is an example of block 110 of
For example, block 105 of
Other types of authentication data may be obtained in a similar fashion. For example, such methods may involve obtaining new finger tap characteristic data while the rightful user is using a software application that does not relate to authentication functionality. The determining process of method 100 may involve determining whether to update existing finger tap characteristic data of the rightful user according to the new finger tap characteristic data. Similarly, such methods may involve obtaining new device movement data while the rightful user is using a software application that does not relate to authentication functionality. The determining process of method 100 may involve determining whether to update existing device movement data of the rightful user according to the new device movement data.
Referring again to
Accordingly, in some implementations, block 110 of
In alternative implementations, the image 130 may indicate another area for the user to touch. In the example shown in
Accordingly, in this example the fingerprint acquisition system 135 may include a type of fingerprint sensor that is capable of obtaining fingerprint data through substantially opaque material. In some implementations, for example, the fingerprint acquisition system 135 may include an ultrasonic fingerprint sensor. Examples of display devices having an ultrasonic fingerprint sensor positioned outside of a display area are described below with reference to
Methods of biometric authorization using a select and drag operation on a display screen may allow secure selection when opening a personalized application or purchasing an on-line item. For example, a user may open an email account, access a personal calendar, view a personal stock portfolio, or view a video by simply selecting an appropriate icon and dragging the selected icon to an authenticating region where biometric information may be acquired and the application started or an operation performed. Other applications or folders, such as those containing personal information, may be opened similarly. In other implementations, bio-secure applications or file folders may be selected and accessed with a drag and authenticate operation.
Methods of biometric authorization using a select and drag operation may allow rapid, secure purchases of on-line items. In a manner reminiscent of yet different from a “one-click” purchasing method, a user may select and drag an icon associated with a purchasable item onto an authenticating region of a mobile device in a “one-drag” purchasing method according to one implementation of the present invention.
In alternative arrangements, a user may select an icon on a display device that becomes highlighted, and then touch or partially touch an indicated area on the display device for the acquisition of biometric information. Pending successful matching of the biometric information, an application associated with the selected icon may be started, a selected item may be purchased, or an operation may be performed.
However, in some implementations, the master fingerprint image data may be obtained, at least in part, according to alternative processes. In some such implementations, at least some of the master fingerprint image data may be obtained during routine use of a display device. For example, in some implementations, the partial fingerprint data may include known fingerprint data of the current master fingerprint data and new fingerprint data. Such implementations may involve updating the master fingerprint data to include the new fingerprint data. For example, as a young user's fingers grow and his or her fingerprints enlarge and evolve, the master fingerprint data may evolve accordingly. For identification purposes such as school lunch programs, the correct authentication of a user throughout a period of growth during a school year without requiring re-enrollment may be a useful convenience.
Here, block 310 involves determining whether the partial fingerprint data includes known fingerprint data and new fingerprint data. If so, the master fingerprint data may be updated to include the new fingerprint data in block 315.
According to some such implementations, the updating process may involve augmenting the master fingerprint data to include the new fingerprint data. For example, referring to
For example, if the partial fingerprint data obtained were to correspond with the fingerprint images 13 shown in
Alternatively, or additionally, the updating process may involve adapting the master fingerprint data. As a child grows, for example, his or her digits will become larger and the spacing between minutiae will increase. However, the types and relative positions of the minutiae may remain substantially the same. Accordingly, in block 310, the partial fingerprint data may be recognized as those of a rightful user, even though the spacing between minutiae may have increased, e.g., beyond a predetermined threshold. Block 315 may involve updating the master fingerprint data by changing, scaling, or otherwise adapting data corresponding to the spacing between at least some of the minutiae. In this example, the process ends in block 320. However, some implementations involve multiple iterations of the blocks shown in
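A sketch of such an adaptation step, rescaling enrolled minutia coordinates by the ratio of mean pairwise spacings; the coordinate representation and the use of a single uniform scale factor are simplifying assumptions:

```python
import math

def adapt_spacing(master, partial):
    """Rescale enrolled minutia coordinates by the ratio of mean pairwise
    spacing in the new (partial) data to that in the master data."""
    def mean_spacing(points):
        pairs = [(p, q) for i, p in enumerate(points) for q in points[i + 1:]]
        return sum(math.hypot(p[0] - q[0], p[1] - q[1]) for p, q in pairs) / len(pairs)
    scale = mean_spacing(partial) / mean_spacing(master)
    adapted = [(x * scale, y * scale) for x, y in master]
    return adapted, scale

# Hypothetical minutia coordinates before and after a child's finger grows.
master = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
grown = [(0.0, 0.0), (11.0, 0.0), (0.0, 11.0)]
adapted, scale = adapt_spacing(master, grown)
```

Here the relative positions of the minutiae are unchanged while their spacing has grown uniformly by 10%, so the adaptation recovers a scale factor of about 1.1 and enlarges the master data accordingly.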
Mobile handheld display devices may be moved, held and touched in many different ways. Accordingly, various methods described herein can adapt to the many different ways that the same user may interact with his or her device.
In this implementation, block 410 involves determining a level of security corresponding to the commercial transaction. Block 410 may, for example, involve determining a level of security based on a transaction amount, which may correspond with a requested payment amount for the commercial transaction and/or an amount of money to be transferred between accounts. In alternative implementations, the level of security determined in block 410 may be based on various other factors, such as a type of merchandise, an amount of available credit and/or the user's credit score.
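A hypothetical tiering of security levels by transaction amount might look like the following; the tier boundaries and labels are illustrative assumptions, not values from this disclosure:

```python
def security_level(amount):
    """Map a requested transaction amount to a security level; the tier
    boundaries here are hypothetical, not values from this disclosure."""
    if amount < 20:
        return "low"     # e.g., partial fingerprint data alone may suffice
    if amount < 500:
        return "medium"  # e.g., partial fingerprint plus tap characteristics
    return "high"        # e.g., full fingerprint and/or additional factors

# A small purchase maps to the lowest tier; a large transfer to the highest.
print(security_level(9.99), security_level(750.0))
```

The level returned for block 410 would then govern, in block 425, how much corroborating authentication data must match before the commercial transaction is authorized.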
In this example, block 415 involves obtaining partial fingerprint data from at least a partial finger touch in the area as presented in block 405. The partial fingerprint data may correspond to a touching portion of a finger or a thumb. Here, block 420 involves comparing the partial fingerprint data with master fingerprint data of a rightful user. In this implementation, block 425 involves determining, based at least in part on the comparing process and the level of security, whether to authorize the commercial transaction.
In some implementations, method 400 (and/or other methods described herein) may involve determining that additional data will be required in order to determine whether to authorize the commercial transaction. The additional data may include full fingerprint data for at least one finger, a finger tap characteristic, device movement data, other authentication data, or a combination thereof. Some examples are provided below.
In alternative implementations, the lowest level of security may correspond to other authentication data, including but not limited to the other types of authentication data shown in
As shown in
Each user may have habitual or characteristic ways of moving the display device, including but not limited to the rotation angle, the rotational velocity and/or acceleration associated with the above-described device movement. A user also may have characteristic ways of holding and/or moving the display device when using it, such as characteristic viewing angles, characteristic tapping forces, characteristic tapping directions, etc. For example, some users may tend to use a “landscape” view, others may prefer a “portrait” view and others may switch between such views. Tapping with a left thumb will tend to produce different device movements than tapping with a right thumb or tapping with an index finger. Tapping a display device that is lying on a surface, such as a desktop, will tend to produce different device movements than tapping a display device held in the hand.
The corresponding device movement data may be detected by one or more motion sensors of a motion sensor system, e.g., by one or more gyroscopes and/or accelerometers of a motion sensor system. In some implementations, some device movements (e.g., of the type shown in
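A simplified sketch of comparing sampled device movement data against a rightful user's enrolled movement profile; the contents of the feature vector (e.g., mean rotation angle, rotational velocity, tap-induced tilt) and the tolerance are assumptions:

```python
def movement_matches(sample, enrolled, tol=0.2):
    """Compare a sampled movement feature vector against the rightful
    user's enrolled vector, feature by feature, with a relative
    per-feature tolerance (a sketch, not the disclosed method)."""
    return all(abs(s - e) <= tol * max(abs(e), 1e-9)
               for s, e in zip(sample, enrolled))

# Hypothetical enrolled feature vector for the rightful user:
# [mean rotation angle (deg), rotational velocity (rad/s), tilt (deg)].
enrolled = [12.0, 0.8, 3.5]
```

A sample close to the enrolled profile matches; a sample with, say, a markedly different rotation angle does not, and the mismatch could weigh against invoking the function.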
The control system 50 may include one or more general purpose single- or multi-chip processors, digital signal processors (DSPs), application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs) or other programmable logic devices, discrete gates or transistor logic, discrete hardware components, or combinations thereof. The control system 50 also may include (and/or be configured for communication with) one or more memory devices, such as one or more random access memory (RAM) devices, read-only memory (ROM) devices, etc.
The control system 50 may be capable of controlling the display to present an image indicating an area for a user to touch and of controlling the fingerprint acquisition system to obtain partial fingerprint data from at least a partial finger touch in the area. The partial fingerprint data may correspond to a touching portion of a finger or a thumb. The control system 50 may be capable of comparing the partial fingerprint data with master fingerprint data of a rightful user and determining, based at least in part on the comparing process, whether to invoke a function. Invoking the function may, for example, involve authorizing a transaction, starting a personalized application, or unlocking the display device.
The partial fingerprint data may, in some instances, include known fingerprint data of the current master fingerprint data and new fingerprint data. In some implementations, the control system may be capable of updating the master fingerprint data to include the new fingerprint data. For example, the control system may be capable of augmenting the master fingerprint data and/or adapting the master fingerprint data.
The fingerprint acquisition system 135 may be any suitable fingerprint acquisition system, including but not limited to the examples described herein. In some implementations, the fingerprint acquisition system 135 may include an ultrasonic imaging system. For example, the fingerprint acquisition system 135 may include an ultrasonic sensor array and an ultrasonic transmitter. According to some implementations, the obtaining process may involve obtaining the partial fingerprint data via the ultrasonic sensor array while maintaining the ultrasonic transmitter in an “off” state. In some examples, the fingerprint acquisition system 135 may be positioned within a display area or, at least in part, outside the display area.
In some implementations, the display device 1340 may include a motion sensor system 520. The motion sensor system 520 may be capable of sensing device movement and providing device movement data to the control system. The control system may be capable of determining whether the device movement data corresponds with device movement data of the rightful user. The process of determining whether to invoke the function may be based, at least in part, on whether the device movement data corresponds with device movement data of the rightful user.
In some examples, the display device 1340 may include a finger tap sensing system 530. The finger tap sensing system 530 may include one or more microphones. In some implementations, the finger tap sensing system 530 may include one or more components of the fingerprint acquisition system 135 and/or one or more components of a touch sensing system.
The control system may be capable of receiving, from the finger tap sensing system 530, information regarding one or more finger taps. The control system may be capable of determining finger tap characteristic data based on the finger tap information. For example, the finger tap characteristic data may correspond with a number of taps, a frequency of taps and/or an audio signature. The process of determining whether to invoke the function may be based, at least in part, on comparing the finger tap characteristic data with finger tap characteristic data of the rightful user.
Various implementations described herein relate to touch sensing systems that include a pressure and force sensing device capable of sensing dynamic pressure or dynamic force. For the sake of simplicity, such a pressure and force sensing device may be referred to herein simply as a “force-sensing device.” Similarly, an applied pressure and force may be referred to herein simply as an “applied force” or the like, with the understanding that applying force with a physical object will also involve applying pressure. In some implementations, the touch sensing system may include a piezoelectric sensing array. In such implementations, an applied force may be detected (and optionally recorded) during a period of time that the force is applied and changing. In some implementations, the force-sensing device may have a sufficiently high resolution to function as a fingerprint sensor.
In some implementations, the touch sensing system may include one or more additional components capable of fingerprint sensing, such as an ultrasonic transmitter that allows the device to become an ultrasonic transducer capable of imaging a finger in detail. In some such implementations, the force-sensing device also may be capable of functioning as an ultrasonic receiver to detect acoustic or ultrasonic energy such as acoustic emissions from a tap on the surface of the sensing system or ultrasonic waves reflected from the surface.
In the example shown in
Force applied by the object 25, which is a finger in this example, may squeeze or otherwise deform at least some of the discrete elements 37 of the piezoelectric layer 36. The receiver bias electrode 39 and the pixel input electrodes 38 allow the array of sensor pixels 32 to measure the electrical charge generated on the surfaces of the discrete elements 37 of the piezoelectric layer 36 that result from the deformation of the discrete elements 37.
The control system 50 may include one or more general purpose single- or multi-chip processors, digital signal processors (DSPs), application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs) or other programmable logic devices, discrete gates or transistor logic, discrete hardware components, or combinations thereof. The control system 50 also may include (and/or be configured for communication with) one or more memory devices, such as one or more random access memory (RAM) devices, read-only memory (ROM) devices, etc. The control system 50 may be capable of determining a location in which the object 25 is exerting a force on the force-sensing device 30 according to signals provided by multiple sensor pixels 32. In some implementations, the control system 50 may be capable of determining locations and/or movements of multiple objects 25. According to some such implementations, the control system 50 may be capable of controlling a device according to one or more determined locations and/or movements. For example, in some implementations, the control system 50 may be capable of controlling a mobile display device, such as the display device 1340 shown in
According to some implementations, the force-sensing device 30 may have a sufficiently high resolution for the touch sensing system 10 to function as a fingerprint sensor. In some implementations, some of which are described below, the touch sensing system 10 may include an ultrasonic transmitter and the force-sensing device 30 may be capable of functioning as an ultrasonic receiver. The control system 50 may be capable of controlling the ultrasonic transmitter and/or the force-sensing device 30 to obtain fingerprint image data, e.g., by capturing fingerprint images. Whether or not the touch sensing system 10 includes an ultrasonic transmitter, the control system 50 may be capable of controlling access to one or more devices based, at least in part, on the fingerprint image data.
In some implementations, the control system 50 may be capable of operating the touch sensing system in an ultrasonic imaging mode or a force-sensing mode. In some implementations, the control system may be capable of maintaining the ultrasonic transmitter in an “off” state when operating the touch sensing system in a force-sensing mode.
In this example, the reset device 9 is capable of resetting the peak detection circuit 8 after reading the charge, making the peak detection circuit 8 ready for reading subsequent charges from the charge amplifier 7. In some implementations, addressing and/or resetting functionality may be provided by TFTs of the TFT substrate 34. A readout transistor for each row or column may be triggered to allow the magnitude of the peak charge for each pixel to be read by additional circuitry not shown in
The elements of the force-sensing device 30 shown in
In some implementations, the touch sensing system 10 may include one or more additional components, such as an ultrasonic transmitter that allows the touch sensing system 10 to function as an ultrasonic transducer capable of imaging a finger in detail. In some such implementations, the force-sensing device 30 may be capable of functioning as an ultrasonic receiver.
The ultrasonic transmitter 20 may be a piezoelectric transmitter that can generate ultrasonic waves 21 (see
In the example shown in
As shown in
The force-sensing device 30 may include an array of sensor pixel circuits 32 disposed on a substrate 34, which also may be referred to as a backplane, and a piezoelectric film layer 36. In some implementations, each sensor pixel circuit 32 may include one or more TFT elements and, in some implementations, one or more additional circuit elements such as diodes, capacitors, and the like. Each sensor pixel circuit 32 may be configured to convert an electric charge generated in the piezoelectric film layer 36 proximate to the pixel circuit into an electrical signal. Each sensor pixel circuit 32 may include a pixel input electrode 38 that electrically couples the piezoelectric film layer 36 to the sensor pixel circuit 32.
In the illustrated implementation, a receiver bias electrode 39 is disposed on a side of the piezoelectric film layer 36 proximal to platen 40. The receiver bias electrode 39 may be a metallized electrode and may be grounded or biased to control which signals may be passed to the array of sensor pixel circuits 32. Ultrasonic energy that is reflected from the exposed (top) surface 42 of the platen 40 may be converted into localized electrical charges by the piezoelectric film layer 36. These localized charges may be collected by the pixel input electrodes 38 and passed on to the underlying sensor pixel circuits 32. The charges may be amplified by the sensor pixel circuits 32 and then provided to the control system 50. Simplified examples of sensor pixel circuits 32 are shown in
The control system 50 may be electrically connected (directly or indirectly) with the first transmitter electrode 24 and the second transmitter electrode 26, as well as with the receiver bias electrode 39 and the sensor pixel circuits 32 on the substrate 34. In some implementations, the control system 50 may operate substantially as described above. For example, the control system 50 may be capable of processing the amplified signals received from the sensor pixel circuits 32.
The control system 50 may be capable of controlling the ultrasonic transmitter 20 and/or the force-sensing device 30 to obtain fingerprint image data, e.g., by obtaining fingerprint images. Whether or not the touch sensing system 10 includes an ultrasonic transmitter 20, the control system 50 may be capable of controlling access to one or more devices based, at least in part, on the fingerprint image data. The touch sensing system 10 (or an associated device) may include a memory system that includes one or more memory devices. In some implementations, the control system 50 may include at least a portion of the memory system. The control system 50 may be capable of capturing a fingerprint image and storing fingerprint image data in the memory system. In some implementations, the control system 50 may be capable of capturing a fingerprint image and storing fingerprint image data in the memory system even while maintaining the ultrasonic transmitter 20 in an “off” state.
In some implementations, the control system 50 may be capable of operating the touch sensing system in an ultrasonic imaging mode or a force-sensing mode. In some implementations, the control system may be capable of maintaining the ultrasonic transmitter 20 in an “off” state when operating the touch sensing system in a force-sensing mode. The force-sensing device 30 may be capable of functioning as an ultrasonic receiver when the touch sensing system 10 is operating in the ultrasonic imaging mode.
In some implementations, the control system 50 may be capable of controlling other devices, such as a display system, a communication system, etc. In some implementations, for example, the control system 50 may be capable of powering on one or more components of a device such as the display device 1340, which is described below with reference to
The platen 40 can be any appropriate material that can be acoustically coupled to the receiver, with examples including plastic, ceramic, sapphire and glass. In some implementations, the platen 40 can be a cover plate, e.g., a cover glass or a lens glass for a display. Particularly when the ultrasonic transmitter 20 is in use, fingerprint detection and imaging can be performed through relatively thick platens if desired, e.g., 3 mm and above. However, for implementations in which the force-sensing device 30 is capable of imaging fingerprints in a force detection mode, a thinner and relatively more compliant platen 40 may be desirable. According to some such implementations, the platen 40 may include one or more polymers, such as one or more types of parylene, and may be substantially thinner. In some such implementations, the platen 40 may be tens of microns thick or even less than 10 microns thick.
Examples of piezoelectric materials that may be used to form the piezoelectric film layer 36 include piezoelectric polymers having appropriate acoustic properties, for example, an acoustic impedance between about 2.5 MRayls and 5 MRayls. Specific examples of piezoelectric materials that may be employed include ferroelectric polymers such as polyvinylidene fluoride (PVDF) and polyvinylidene fluoride-trifluoroethylene (PVDF-TrFE) copolymers. Examples of PVDF copolymers include 60:40 (molar percent) PVDF-TrFE, 70:30 PVDF-TrFE, 80:20 PVDF-TrFE, and 90:10 PVDF-TrFE. Other examples of piezoelectric materials that may be employed include polyvinylidene chloride (PVDC) homopolymers and copolymers, polytetrafluoroethylene (PTFE) homopolymers and copolymers, and diisopropylammonium bromide (DIPAB).
The thickness of each of the piezoelectric transmitter layer 22 and the piezoelectric film layer 36 may be selected so as to be suitable for generating and receiving ultrasonic waves. In one example, a PVDF piezoelectric transmitter layer 22 is approximately 28 μm thick and a PVDF-TrFE receiver layer 36 is approximately 12 μm thick. Example frequencies of the ultrasonic waves may be in the range of 5 MHz to 30 MHz, with wavelengths on the order of a quarter of a millimeter or less.
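The stated wavelength range can be checked with a short calculation, assuming a representative sound speed of about 1,500 m/s (an assumption; the actual speed depends on the platen, coupling and tissue materials):

```python
def wavelength_mm(freq_hz, speed_m_s=1500.0):
    """Acoustic wavelength in millimeters for a given frequency, assuming
    a sound speed of ~1500 m/s (typical of water, soft tissue and many
    polymers; the exact value depends on the materials involved)."""
    return speed_m_s / freq_hz * 1000.0

# 5 MHz -> ~0.3 mm; 30 MHz -> ~0.05 mm, i.e. on the order of a quarter
# of a millimeter or less, consistent with the range stated above.
low, high = wavelength_mm(5e6), wavelength_mm(30e6)
```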
Also depicted in
It is to be understood that the components shown in
In the implementation shown in
A more integrated version of the display module 1100 is depicted in
One notable difference between
The configurations shown in
As can be seen, the sensor pixel array 1238 in
In this example, the display device 1340 includes a housing 1341, a display 1330, a touch sensing system 10, an antenna 1343, a speaker 1345, an input device 1348 and a microphone 1346. The housing 1341 may be formed from any of a variety of manufacturing processes, including injection molding and vacuum forming. In addition, the housing 1341 may be made from any of a variety of materials, including, but not limited to: plastic, metal, glass, rubber and ceramic, or a combination thereof. The housing 1341 may include removable portions (not shown) that may be interchanged with other removable portions of different color, or containing different logos, pictures, or symbols.
The display 1330 may be any of a variety of displays, including a flat-panel display, such as plasma, organic light-emitting diode (OLED) or liquid crystal display (LCD), or a non-flat-panel display, such as a cathode ray tube (CRT) or other tube device. In addition, the display 1330 may include an interferometric modulator (IMOD)-based display or a micro-shutter based display.
The components of one example of the display device 1340 are schematically illustrated in
In this example, the display device 1340 also includes a touch and fingerprint controller 1377. The touch and fingerprint controller 1377 may, for example, be a part of a control system 50 such as that described above. Accordingly, in some implementations the touch and fingerprint controller 1377 (and/or other components of the control system 50) may include one or more memory devices. In some implementations, the control system 50 also may include components such as the processor 1321, the array driver 1322 and/or the driver controller 1329 shown in
The touch and fingerprint controller 1377 (and/or another element of the control system 50) may be capable of providing input for controlling the display device 1340 according to one or more touch locations. In some implementations, the touch and fingerprint controller 1377 may be capable of determining movements of one or more touch locations and of providing input for controlling the display device 1340 according to the movements. Alternatively, or additionally, the touch and fingerprint controller 1377 may be capable of determining locations and/or movements of objects that are proximate the display device 1340. Accordingly, the touch and fingerprint controller 1377 may be capable of detecting finger or stylus movements, hand gestures, etc., even if no contact is made with the display device 1340. The touch and fingerprint controller 1377 may be capable of providing input for controlling the display device 1340 according to such detected movements and/or gestures.
As described elsewhere herein, the touch and fingerprint controller 1377 (or another element of the control system 50) may be capable of providing one or more fingerprint detection operational modes. Accordingly, in some implementations the touch and fingerprint controller 1377 (or another element of the control system 50) may be capable of producing fingerprint images.
In some implementations, the touch sensing system 10 may include a force-sensing device 30 and/or an ultrasonic transmitter 20 such as described elsewhere herein. According to some such implementations, the touch and fingerprint controller 1377 (or another element of the control system 50) may be capable of receiving input from the force-sensing device 30 and powering on or “waking up” the ultrasonic transmitter 20 and/or another component of the display device 1340.
The network interface 1327 includes the antenna 1343 and the transceiver 1347 so that the display device 1340 may communicate with one or more devices over a network. The network interface 1327 also may have some processing capabilities to relieve, for example, data processing requirements of the processor 1321. The antenna 1343 may transmit and receive signals. In some implementations, the antenna 1343 transmits and receives RF signals according to the IEEE 16.11 standard, including IEEE 16.11(a), (b), or (g), or the IEEE 802.11 standard, including IEEE 802.11a, b, g, n, and further implementations thereof. In some other implementations, the antenna 1343 transmits and receives RF signals according to the Bluetooth® standard. In the case of a cellular telephone, the antenna 1343 may be designed to receive code division multiple access (CDMA), frequency division multiple access (FDMA), time division multiple access (TDMA), Global System for Mobile communications (GSM), GSM/General Packet Radio Service (GPRS), Enhanced Data GSM Environment (EDGE), Terrestrial Trunked Radio (TETRA), Wideband-CDMA (W-CDMA), Evolution Data Optimized (EV-DO), 1xEV-DO, EV-DO Rev A, EV-DO Rev B, High Speed Packet Access (HSPA), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Evolved High Speed Packet Access (HSPA+), Long Term Evolution (LTE), AMPS, or other known signals that are used to communicate within a wireless network, such as a system utilizing 3G, 4G or 5G technology. The transceiver 1347 may pre-process the signals received from the antenna 1343 so that they may be received by and further manipulated by the processor 1321. The transceiver 1347 also may process signals received from the processor 1321 so that they may be transmitted from the display device 1340 via the antenna 1343.
In some implementations, the transceiver 1347 may be replaced by a receiver. In addition, in some implementations, the network interface 1327 may be replaced by an image source, which may store or generate image data to be sent to the processor 1321. The processor 1321 may control the overall operation of the display device 1340. The processor 1321 receives data, such as compressed image data from the network interface 1327 or an image source, and processes the data into raw image data or into a format that may be readily processed into raw image data. The processor 1321 may send the processed data to the driver controller 1329 or to the frame buffer 1328 for storage. Raw data typically refers to the information that identifies the image characteristics at each location within an image. For example, such image characteristics may include color, saturation and gray-scale level.
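The distinction above between compressed image data and raw image data can be illustrated with a small sketch. The encoding chosen here (run-length pairs) and all names are hypothetical assumptions for illustration only; the disclosure does not specify a compression format.

```python
# Hypothetical sketch: "raw" image data stores the image characteristic
# (here, a gray-scale level) directly at each pixel location, whereas the
# compressed form stores (level, run-length) pairs per row.

def decode_to_raw(compressed_rows):
    """Expand run-length-encoded rows into raw per-pixel gray levels."""
    raw = []
    for row in compressed_rows:
        pixels = []
        for level, count in row:       # (gray level, run length) pairs
            pixels.extend([level] * count)
        raw.append(pixels)
    return raw

frame = decode_to_raw([[(255, 2), (0, 2)],    # row 0: two white, two black
                       [(128, 4)]])           # row 1: four mid-gray
# frame == [[255, 255, 0, 0], [128, 128, 128, 128]]
```

The raw form is what downstream elements such as the driver controller 1329 can consume without further decoding.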
The processor 1321 may include a microcontroller, CPU, or logic unit to control operation of the display device 1340. The conditioning hardware 1352 may include amplifiers and filters for transmitting signals to the speaker 1345, and for receiving signals from the microphone 1346. The conditioning hardware 1352 may be discrete components within the display device 1340, or may be incorporated within the processor 1321 or other components.
The driver controller 1329 may take the raw image data generated by the processor 1321 either directly from the processor 1321 or from the frame buffer 1328 and may re-format the raw image data appropriately for high speed transmission to the array driver 1322. In some implementations, the driver controller 1329 may re-format the raw image data into a data flow having a raster-like format, such that it has a time order suitable for scanning across the display array 1330. Then the driver controller 1329 sends the formatted information to the array driver 1322. Although a driver controller 1329, such as an LCD controller, is often associated with the system processor 1321 as a stand-alone Integrated Circuit (IC), such controllers may be implemented in many ways. For example, controllers may be embedded in the processor 1321 as hardware, embedded in the processor 1321 as software, or fully integrated in hardware with the array driver 1322.
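The raster-like reordering described above amounts to emitting pixels in the time order in which scan lines sweep the display array: row by row, left to right. A minimal hypothetical sketch (function name assumed, not from the disclosure):

```python
def to_raster_stream(frame):
    """Flatten a 2-D frame into the row-major, time-ordered sequence
    in which lines are scanned across the display array."""
    stream = []
    for row in frame:           # top-to-bottom scan order
        stream.extend(row)      # left-to-right within each scan line
    return stream

to_raster_stream([[1, 2],
                  [3, 4]])     # -> [1, 2, 3, 4]
```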
The array driver 1322 may receive the formatted information from the driver controller 1329 and may re-format the video data into a parallel set of waveforms that are applied many times per second to the hundreds, and sometimes thousands (or more), of leads coming from the display's x-y matrix of display elements.
In some implementations, the driver controller 1329, the array driver 1322, and the display array 1330 are appropriate for any of the types of displays described herein. For example, the driver controller 1329 may be a conventional display controller or a bi-stable display controller (such as an IMOD display element controller). Additionally, the array driver 1322 may be a conventional driver or a bi-stable display driver. Moreover, the display array 1330 may be a conventional display array or a bi-stable display. In some implementations, the driver controller 1329 may be integrated with the array driver 1322. Such an implementation may be useful in highly integrated systems, for example, mobile phones, portable-electronic devices, watches or small-area displays.
In some implementations, the input device 1348 may be capable of allowing, for example, a user to control the operation of the display device 1340. The input device 1348 may include a keypad, such as a QWERTY keyboard or a telephone keypad, a button, a switch, a rocker, a touch-sensitive screen, a touch-sensitive screen integrated with the display array 1330, or a pressure- or heat-sensitive membrane. The microphone 1346 may be capable of functioning as an input device for the display device 1340. In some implementations, voice commands through the microphone 1346 may be used for controlling operations of the display device 1340.
The power supply 1350 may include a variety of energy storage devices. For example, the power supply 1350 may be a rechargeable battery, such as a nickel-cadmium battery or a lithium-ion battery. In implementations using a rechargeable battery, the rechargeable battery may be chargeable using power coming from, for example, a wall socket or a photovoltaic device or array. Alternatively, the rechargeable battery may be wirelessly chargeable. The power supply 1350 also may be a renewable energy source, a capacitor, or a solar cell, including a plastic solar cell or solar-cell paint. The power supply 1350 also may be capable of receiving power from a wall outlet.
In some implementations, control programmability resides in the driver controller 1329 which may be located in several places in the electronic display system. In some other implementations, control programmability resides in the array driver 1322. The above-described optimization may be implemented in any number of hardware and/or software components and in various configurations.
As used herein, a phrase referring to “at least one of” a list of items refers to any combination of those items, including single members. As an example, “at least one of: a, b, or c” is intended to cover: a, b, c, a-b, a-c, b-c, and a-b-c.
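The combinations covered by this definition can be enumerated mechanically; the sketch below simply generates every non-empty combination of the listed items (names are illustrative):

```python
from itertools import combinations

def at_least_one_of(items):
    """All non-empty combinations covered by 'at least one of' a list."""
    result = []
    for r in range(1, len(items) + 1):
        for combo in combinations(items, r):
            result.append("-".join(combo))
    return result

at_least_one_of(["a", "b", "c"])
# -> ['a', 'b', 'c', 'a-b', 'a-c', 'b-c', 'a-b-c']
```

For the three-item list in the example, this yields exactly the seven cases recited above.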
The various illustrative logics, logical blocks, modules, circuits and algorithm processes described in connection with the implementations disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. The interchangeability of hardware and software has been described generally, in terms of functionality, and illustrated in the various illustrative components, blocks, modules, circuits and processes described above. Whether such functionality is implemented in hardware or software depends upon the particular application and design constraints imposed on the overall system.
The hardware and data processing apparatus used to implement the various illustrative logics, logical blocks, modules and circuits described in connection with the aspects disclosed herein may be implemented or performed with a general purpose single- or multi-chip processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general purpose processor may be a microprocessor or any conventional processor, controller, microcontroller, or state machine. A processor also may be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. In some implementations, particular processes and methods may be performed by circuitry that is specific to a given function.
In one or more aspects, the functions described may be implemented in hardware, digital electronic circuitry, computer software, firmware, including the structures disclosed in this specification and their structural equivalents, or in any combination thereof. Implementations of the subject matter described in this specification also may be implemented as one or more computer programs, i.e., one or more modules of computer program instructions, encoded on computer storage media for execution by, or to control the operation of, data processing apparatus.
If implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium, such as a non-transitory medium. The processes of a method or algorithm disclosed herein may be implemented in a processor-executable software module which may reside on a computer-readable medium. Computer-readable media include both computer storage media and communication media including any medium that may be enabled to transfer a computer program from one place to another. Storage media may be any available media that may be accessed by a computer. By way of example, and not limitation, non-transitory media may include RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that may be used to store desired program code in the form of instructions or data structures and that may be accessed by a computer. Also, any connection may be properly termed a computer-readable medium. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media. Additionally, the operations of a method or algorithm may reside as one or any combination or set of codes and instructions on a machine-readable medium and computer-readable medium, which may be incorporated into a computer program product.
Various modifications to the implementations described in this disclosure may be readily apparent to those having ordinary skill in the art, and the generic principles defined herein may be applied to other implementations without departing from the spirit or scope of this disclosure. Thus, the disclosure is not intended to be limited to the implementations shown herein, but is to be accorded the widest scope consistent with the claims, the principles and the novel features disclosed herein. The word “exemplary” is used exclusively herein, if at all, to mean “serving as an example, instance, or illustration.” Any implementation described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other implementations.
Certain features that are described in this specification in the context of separate implementations also may be implemented in combination in a single implementation. Conversely, various features that are described in the context of a single implementation also may be implemented in multiple implementations separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination may in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.
Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the implementations described above should not be understood as requiring such separation in all implementations, and it should be understood that the described program components and systems may generally be integrated together in a single software product or packaged into multiple software products. Additionally, other implementations are within the scope of the following claims. In some cases, the actions recited in the claims may be performed in a different order and still achieve desirable results.
It will be understood that unless features in any of the particular described implementations are expressly identified as incompatible with one another or the surrounding context implies that they are mutually exclusive and not readily combinable in a complementary and/or supportive sense, the totality of this disclosure contemplates and envisions that specific features of those complementary implementations may be selectively combined to provide one or more comprehensive, but slightly different, technical solutions. It will therefore be further appreciated that the above description has been given by way of example only and that modifications in detail may be made within the scope of this disclosure.
This application claims priority to U.S. Provisional Application No. 61/900,851, filed on Nov. 6, 2013 and entitled “USER AUTHENTICATION BIOMETRICS IN MOBILE DEVICES,” which is hereby incorporated by reference. This application also claims priority to U.S. Provisional Application No. 61/830,582, filed on Jun. 3, 2013 and entitled “DISPLAY WITH PERIPHERALLY CONFIGURED ULTRASONIC BIOMETRIC SENSOR,” which is hereby incorporated by reference. This application also claims priority to U.S. application Ser. No. 14/071,320, filed on Nov. 4, 2013 and entitled “PIEZOELECTRIC FORCE SENSING ARRAY,” which is hereby incorporated by reference.
Number | Date | Country
---|---|---
61900851 | Nov 2013 | US
61830582 | Jun 2013 | US
 | Number | Date | Country
---|---|---|---
Parent | 14071320 | Nov 2013 | US
Child | 14178156 | | US